Actor Catalog Fetch Event Duplicate Key exception #21208
Comments
I confirm that by using `row_number()` as @michelgalle stated, the problem goes away, at least from a UI standpoint.
Thanks for the report, and the investigation! It seems like the 2 rows with the same timestamp are happening because the frontend is actually requesting schema discovery twice. This isn't intended, but it shouldn't cause the backend to fail like this. Indeed, as you note, using `row_number()` instead of `rank()` should keep the query from returning duplicate rows.
@mfsiega-airbyte I found that both of the affected query methods have the same problem. After both methods were patched I could set up a connection successfully.
@hugozap thanks for the pointer! I put up a PR for one of the methods. For the other, could you clarify how the duplicate rows cause it to fail?
@mfsiega-airbyte Sorry, I was not clear. For the second method, I fixed it locally by adding `ACTOR_CATALOG_FETCH_EVENT.id.desc()` to the ORDER BY to break the tie, so that `limit(1)` works again. To test that it fails, you would have to simulate the current write bug by inserting a duplicate record (same actor id and create date).
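In plain SQL, the tie-break amounts to the query shape below. This is only a sketch: the column names (`actor_id`, `created_at`, `id`) and the `:actor_id` bind parameter are assumptions for illustration, not the exact jOOQ-generated SQL.

```sql
-- Sketch of the "most recent fetch event" lookup with a tie-breaker.
-- Two rows with the same created_at make "most recent" ambiguous;
-- adding id DESC as a second sort key makes LIMIT 1 deterministic.
SELECT *
FROM actor_catalog_fetch_event
WHERE actor_id = :actor_id
ORDER BY created_at DESC, id DESC
LIMIT 1;
```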
Environment
Current Behavior
After setting up a connector (unfortunately I don't have the logs, nor do I remember which connector it was), I started getting a duplicate key error in the UI.
Expected Behavior
The UI behaves as expected.
Logs
The only logs I can get are the ones posted in a Slack thread.
Steps to Reproduce
For some reason there are 2 rows in `actor_catalog_fetch_event` with the same timestamp for the same actor. Not sure why it happened. I already did some troubleshooting and know the reason for the error: since there are 2 rows with the same timestamp, the query should be using `row_number()` instead of `rank()`. Since I am new to Airbyte, I am not able to say whether the code that inserts into the table should be fixed so we do not end up with 2 rows with the same timestamp for the same `actor`, or whether simply fixing the query I mentioned above is enough.
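As an illustration of the difference, here is a minimal SQL sketch; the column names `actor_id`, `actor_catalog_id`, and `created_at` are assumptions, and the real query in the codebase may look different. When two rows share the latest `created_at` for an actor, `rank()` assigns 1 to both, so filtering on rank 1 returns duplicates, while `row_number()` numbers them 1 and 2 and returns a single row.

```sql
-- Both window functions computed side by side for comparison.
-- With two rows per actor sharing the same created_at:
--   rank()       -> both rows get 1, so "rnk = 1" returns 2 rows per actor
--   row_number() -> rows get 1 and 2, so "rn = 1" always returns exactly 1 row
SELECT actor_id, actor_catalog_id
FROM (
  SELECT actor_id,
         actor_catalog_id,
         rank()       OVER (PARTITION BY actor_id ORDER BY created_at DESC) AS rnk,
         row_number() OVER (PARTITION BY actor_id ORDER BY created_at DESC) AS rn
  FROM actor_catalog_fetch_event
) latest
WHERE rn = 1;  -- filtering on rnk = 1 instead is what produces the duplicate rows
```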
Are you willing to submit a PR?
No