Integration test for arrow record batch <-> postgres row round trip conversion #38

Merged
110 changes: 61 additions & 49 deletions .github/workflows/pr.yaml
@@ -1,50 +1,62 @@
---
name: pr

on:
  pull_request:
    branches:
      - main

jobs:
  lint:
    name: Clippy
    runs-on: ubuntu-latest

    steps:
      - uses: actions/checkout@v4

      - uses: dtolnay/rust-toolchain@stable

      - name: Install Protoc
        uses: arduino/setup-protoc@v3
        with:
          repo-token: ${{ secrets.GITHUB_TOKEN }}

      - run: cargo clippy --all-features -- -D warnings

  build:
    name: Build
    runs-on: ubuntu-latest

    steps:
      - uses: actions/checkout@v4

      - uses: dtolnay/rust-toolchain@stable

      # Putting this into a GitHub Actions matrix will run a separate job per matrix item, whereas in theory
      # this can re-use the existing build cache to go faster.
      - name: Build without default features
        run: cargo check --no-default-features

      - name: Build with only duckdb
        run: cargo check --no-default-features --features duckdb

      - name: Build with only postgres
        run: cargo check --no-default-features --features postgres

      - name: Build with only sqlite
        run: cargo check --no-default-features --features sqlite

      - name: Build with only mysql
        run: cargo check --no-default-features --features mysql

  integration-test:
    name: Integration Test
    runs-on: ubuntu-latest

    steps:
      - uses: actions/checkout@v4

      - uses: dtolnay/rust-toolchain@stable

      - name: Run integration test
        run: cargo test --test integration --no-default-features --features postgres -- --nocapture
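The new `integration-test` job runs `cargo test --test integration --no-default-features --features postgres -- --nocapture`, matching the Makefile target added below. As a rough orientation for what a row ↔ `RecordBatch` round-trip assertion can look like — a hedged sketch, not the PR's actual `tests/integration.rs` — the following assumes a Postgres instance reachable on localhost, the crate's existing tokio / tokio-postgres dependencies, and a placeholder crate name for importing `rows_to_arrow` (the module path mirrors `src/sql/arrow_sql_gen/postgres.rs` from this PR):

```rust
// Hedged sketch only — not the PR's tests/integration.rs. Assumes a local Postgres
// and a placeholder crate name; `rows_to_arrow` is the function touched by this PR.
use arrow::array::{Array, Int32Array};
use table_providers::sql::arrow_sql_gen::postgres::rows_to_arrow; // placeholder crate name

#[tokio::test]
async fn postgres_rows_round_trip_to_arrow() -> anyhow::Result<()> {
    // Connect to a throwaway Postgres (the real test provisions its own instance).
    let (client, connection) = tokio_postgres::connect(
        "host=localhost user=postgres password=postgres",
        tokio_postgres::NoTls,
    )
    .await?;
    tokio::spawn(connection); // drive the connection in the background

    // Seed a table with known values.
    client
        .batch_execute(
            "CREATE TABLE round_trip (id INT4 NOT NULL);
             INSERT INTO round_trip VALUES (1), (2), (3);",
        )
        .await?;

    // Read the rows back and convert them into an Arrow RecordBatch.
    let rows = client.query("SELECT id FROM round_trip ORDER BY id", &[]).await?;
    let batch = rows_to_arrow(&rows)?;

    // The values should survive the trip unchanged.
    let ids = batch
        .column(0)
        .as_any()
        .downcast_ref::<Int32Array>()
        .expect("id column should decode as Int32");
    let values: Vec<i32> = ids.iter().flatten().collect();
    assert_eq!(values, vec![1, 2, 3]);
    Ok(())
}
```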
3 changes: 3 additions & 0 deletions Cargo.toml
@@ -48,6 +48,9 @@ datafusion-federation-sql = { git = "https://github.com/spiceai/datafusion-feder
itertools = "0.13.0"

[dev-dependencies]
anyhow = "1.0.86"
bollard = "0.16.1"
rand = "0.8.5"
reqwest = "0.12.5"
secrecy = "0.8.0"
tracing-subscriber = { version = "0.3.18", features = ["env-filter"] }
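The added dev-dependencies suggest how the integration test provisions its database: `bollard` drives the local Docker daemon, `anyhow` keeps test error handling simple, and `rand` can randomize names. As a hedged sketch of that idea — assuming bollard 0.16's container API, a locally available `postgres:16` image, and made-up container name and credentials; this is not the PR's actual harness — starting a disposable Postgres container might look like this:

```rust
use std::collections::HashMap;

use bollard::container::{Config, CreateContainerOptions, StartContainerOptions};
use bollard::models::{HostConfig, PortBinding};
use bollard::Docker;

/// Start a disposable Postgres container and return its name.
/// Assumes the `postgres:16` image is already present locally (no pull step here).
async fn start_postgres_container() -> anyhow::Result<String> {
    let docker = Docker::connect_with_local_defaults()?;
    let name = "arrow-postgres-integration-test"; // made-up name for illustration

    // Expose the container's 5432 on the host so the test can connect over TCP.
    let port_bindings = HashMap::from([(
        "5432/tcp".to_string(),
        Some(vec![PortBinding {
            host_ip: Some("127.0.0.1".to_string()),
            host_port: Some("5432".to_string()),
        }]),
    )]);

    docker
        .create_container(
            Some(CreateContainerOptions { name, platform: None }),
            Config {
                image: Some("postgres:16"),
                env: Some(vec!["POSTGRES_PASSWORD=postgres"]),
                host_config: Some(HostConfig {
                    port_bindings: Some(port_bindings),
                    ..Default::default()
                }),
                ..Default::default()
            },
        )
        .await?;

    docker
        .start_container(name, None::<StartContainerOptions<String>>)
        .await?;

    Ok(name.to_string())
}
```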
6 changes: 5 additions & 1 deletion Makefile
@@ -7,4 +7,8 @@ test:

.PHONY: lint
lint:
	cargo clippy --all-features

.PHONY: test-integration
test-integration:
	cargo test --test integration --no-default-features --features postgres -- --nocapture
2 changes: 1 addition & 1 deletion src/sql/arrow_sql_gen/postgres.rs
@@ -269,7 +269,7 @@ pub fn rows_to_arrow(rows: &[Row]) -> Result<RecordBatch> {
 };
 let v = row.try_get::<usize, Option<Value>>(i).with_context(|_| {
     FailedToGetRowValueSnafu {
-        pg_type: Type::TIME,
+        pg_type: Type::JSON,
     }
 })?;
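The one-line fix above corrects the error context in `rows_to_arrow`'s JSON branch: a failed `try_get` on a JSON column was previously reported against `Type::TIME` and now correctly reports `Type::JSON`. For context, here is a hedged sketch of how a JSON column can be read from Postgres rows into an Arrow string builder — not the crate's exact code; it assumes tokio-postgres is built with its serde_json integration (so `serde_json::Value` implements `FromSql`) and simplifies error handling to `anyhow`:

```rust
use arrow::array::StringBuilder;
use serde_json::Value;
use tokio_postgres::Row;

/// Append the JSON column at index `i` of each row to an Arrow string builder.
/// Hypothetical helper for illustration; the crate's real code uses snafu error contexts instead.
fn append_json_column(rows: &[Row], i: usize, builder: &mut StringBuilder) -> anyhow::Result<()> {
    for row in rows {
        // `Option<Value>` handles SQL NULLs; a decode failure surfaces as an error
        // that the fixed code now labels with `Type::JSON` rather than `Type::TIME`.
        let v: Option<Value> = row.try_get(i)?;
        match v {
            Some(json) => builder.append_value(json.to_string()),
            None => builder.append_null(),
        }
    }
    Ok(())
}
```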