UInt Type Not Supported for Read or Write on the Connector

What happened?
UInt types are not supported for read or write on the connector.

Affects Versions
master

What are you seeing the problem on?
No response

How to reproduce
ClickHouse Create Table SQL:

CREATE TABLE test.persona_wide_table
(
    `id` UInt64,
    `col1` String,
    `col2` String,
    `col3` String
)
ENGINE = MergeTree
ORDER BY id
SETTINGS index_granularity = 8192;

Flink SQL Create And Select SQL:

CREATE TABLE test
(
    `id` DECIMAL,
    `col1` String,
    `col2` String,
    `col3` String
) WITH (
    'connector' = 'clickhouse',
    'url' = 'jdbc:ch://xxxx:8123?useSSL=false',
    'database-name' = 'test',
    'table-name' = 'persona_wide_table',
    'sink.batch-size' = '500',
    'sink.flush-interval' = '1000',
    'sink.max-retries' = '3'
);

ClickHouseDataType

Relevant log output
No response
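For context on the reproduction above: ClickHouse's UInt64 spans [0, 2^64 − 1], while Flink SQL has no unsigned integer types and BIGINT is a signed 64-bit value, so UInt64 values above Long.MAX_VALUE cannot be represented directly; that is why `id` is declared as DECIMAL in the Flink DDL (DECIMAL(20, 0) is wide enough for any UInt64). A minimal Java sketch of the widening, illustrative only and not the connector's actual code:

import java.math.BigDecimal;

// Illustrative only: shows why UInt64 needs a wider Flink type than BIGINT.
public final class UInt64Widening {

    // A UInt64 read through JDBC as a primitive long keeps the raw 64-bit
    // pattern, so values above Long.MAX_VALUE show up negative. Reinterpret
    // the bits as unsigned before handing the value to Flink.
    static BigDecimal uint64ToDecimal(long raw) {
        return new BigDecimal(Long.toUnsignedString(raw));
    }

    public static void main(String[] args) {
        long raw = -1L; // bit pattern of UInt64 max
        System.out.println(uint64ToDecimal(raw)); // 18446744073709551615
    }
}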
Hi @czy006: I'm working on the data conversion logic for compatibility with clickhouse-jdbc 0.6. Can you assign this issue to me?
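For readers following along, here is a rough sketch of the read-side direction such a conversion can take, using plain JDBC. The URL is a placeholder, and the assumption that clickhouse-jdbc 0.6 surfaces UInt64 values as a BigInteger or an unsigned wrapper type (rather than a raw signed Long) is exactly that, an assumption:

import java.math.BigDecimal;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

// Sketch, not the connector's code: normalize whatever the driver returns
// for a UInt64 column into BigDecimal before it reaches Flink's DECIMAL.
public final class UIntReadSketch {
    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection(
                "jdbc:ch://localhost:8123/test"); // placeholder URL
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery(
                 "SELECT id FROM persona_wide_table")) {
            while (rs.next()) {
                Object raw = rs.getObject("id");
                // Assumes the driver returns BigInteger or an unsigned
                // wrapper whose toString() is the unsigned decimal value.
                // A driver returning a raw signed Long would need
                // Long.toUnsignedString(...) instead.
                BigDecimal id = new BigDecimal(raw.toString());
                System.out.println(id);
            }
        }
    }
}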
[Fix] Adjust data conversion logic for compatibility with clickhouse-jdbc 0.6 #136
7cce8b1
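The fix also has to cover the write path. A hedged sketch of what writing a Flink DECIMAL value into a UInt64 column can look like over plain JDBC; whether clickhouse-jdbc 0.6 coerces the BigDecimal parameter for UInt64 columns as shown is an assumption here, not something this thread confirms:

import java.math.BigDecimal;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

// Sketch of the write direction: a Flink DECIMAL arrives as BigDecimal and
// must reach a ClickHouse UInt64 column without sign issues. Relying on the
// driver to coerce the BigDecimal is an assumption about clickhouse-jdbc 0.6.
public final class UIntWriteSketch {
    public static void main(String[] args) throws Exception {
        BigDecimal id = new BigDecimal("18446744073709551615"); // UInt64 max
        try (Connection conn = DriverManager.getConnection(
                "jdbc:ch://localhost:8123/test"); // placeholder URL
             PreparedStatement ps = conn.prepareStatement(
                 "INSERT INTO persona_wide_table (id, col1, col2, col3) "
                     + "VALUES (?, ?, ?, ?)")) {
            ps.setBigDecimal(1, id);
            ps.setString(2, "a");
            ps.setString(3, "b");
            ps.setString(4, "c");
            ps.executeUpdate();
        }
    }
}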
Hi @czy006,
I'm going to take over this issue if there are no further questions. Hope to get your feedback.
Merge pull request #145 from /issues/136
cf34ed6
[Fix] Adjust data conversion logic for compatibility with clickhouse-jdbc 0.6 #136