dlt version
1.2.0
Describe the problem
I am trying to build a pipeline for moving data from postgres to clickhouse, and setting dataset_table_separator to a single underscore (_) throws an exception. I am setting the dataset_table_separator like below:
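(A minimal sketch rather than my exact code; it assumes the separator can be passed directly to the clickhouse destination factory, and the dataset name is a placeholder.)

```python
import dlt

# Pass the separator to the clickhouse destination factory
# (assuming the factory forwards it into the destination configuration).
# The same setting should also be reachable via the
# DESTINATION__CLICKHOUSE__DATASET_TABLE_SEPARATOR environment variable.
clickhouse = dlt.destinations.clickhouse(dataset_table_separator="_")

pipeline = dlt.pipeline(
    pipeline_name="pg_to_clickhouse",
    destination=clickhouse,
    dataset_name="dlt",  # hypothetical dataset name
)
```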
Here is the full error message:
From what I understand, dlt tries to create the dlt_dlt_sentinel_table table twice in the pipeline.run() process.
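My reading of how that name is built (an illustration only, assuming a dataset_name of "dlt", the default dataset_sentinel_table_name, and that dlt simply joins dataset name, separator, and table name for ClickHouse):

```python
dataset_name = "dlt"              # assumed dataset name, for illustration only
separator = "_"                   # the configured dataset_table_separator
sentinel = "dlt_sentinel_table"   # default dataset_sentinel_table_name

# ClickHouse has no separate dataset/schema level, so dlt prefixes every
# table name with the dataset name and the configured separator.
print(f"{dataset_name}{separator}{sentinel}")  # -> dlt_dlt_sentinel_table
```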
Using different combinations of values for dataset_sentinel_table_name and dataset_name did not work either. I am wondering if this is an intended behaviour or not.
Setting the dataset_table_separator in the secrets.toml file results in the same error too.
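For completeness, this is roughly what I put in secrets.toml (a sketch, assuming the standard destination config section):

```toml
[destination.clickhouse]
dataset_table_separator = "_"
```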
Expected behavior
Expected behaviour is a running pipeline moving data from postgres into a clickhouse table, with _ being the separator character between dataset_name and table_name.
Steps to reproduce
Here is a code snippet to recreate the issue:
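(A minimal sketch of the kind of pipeline involved, not my original snippet; the Postgres connection string, table name, and dataset name are placeholders, and it assumes the sql_database core source plus the clickhouse destination factory accepting dataset_table_separator as a keyword argument.)

```python
import dlt
from dlt.sources.sql_database import sql_database

# Placeholder Postgres connection string and table name.
source = sql_database(
    "postgresql://loader:loader@localhost:5432/source_db",
    table_names=["some_table"],
)

pipeline = dlt.pipeline(
    pipeline_name="pg_to_clickhouse",
    # ClickHouse credentials are expected in secrets.toml or environment variables.
    destination=dlt.destinations.clickhouse(dataset_table_separator="_"),
    dataset_name="dlt",  # hypothetical dataset name
)

info = pipeline.run(source)
print(info)
```

Running this with dataset_table_separator set to "_" is what raises the exception for me.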
Operating system
macOS
Runtime environment
Local
Python version
3.11
dlt data source
Postgres
dlt destination
Clickhouse
Other deployment details
No response
Additional information
No response