Don't hang the request while processing the CSV #6
Relevant code comment: `datasette_upload_csvs/app.py`, lines 25 to 34, at commit `54aa31f`.
I think the easiest way to do this is with the new Datasette write queue mechanism. It's not ideal for long operations, but I think it's OK in this particular case to lock the database write connection for the duration of processing the CSV.

What I'd really like is to be able to poll for progress on the conversion and show another progress bar. The problem there is that I don't know how many rows in total are going to be imported - at least not without parsing the CSV file a first time just to count rows.

But... maybe I can use the file size trick? If I know the size of the file in bytes, I can have the CSV reader work against a wrapper class which counts the number of bytes that have been read.

I would still need a mechanism for the write thread to make the progress available to be polled. How about it writes to a temporary table every few thousand bytes or so?
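The byte-counting wrapper described above could look something like this - a minimal sketch (the class name and usage are assumptions, not the plugin's actual code) that wraps the file iterator and tracks how much has been consumed, so progress can be estimated against the known total file size:

```python
import csv
import io


class ByteCountingReader:
    """Hypothetical wrapper: counts characters consumed from a text stream
    so progress can be estimated as bytes_read / total_size."""

    def __init__(self, fileobj):
        self.fileobj = fileobj
        self.bytes_read = 0

    def __iter__(self):
        return self

    def __next__(self):
        # csv.reader pulls lines lazily, so this count tracks parse progress
        line = next(self.fileobj)
        self.bytes_read += len(line)
        return line


# Example: total size is known up front (e.g. from the upload's Content-Length)
data = "id,name\n1,Cleo\n2,Pancakes\n"
total = len(data)
wrapper = ByteCountingReader(io.StringIO(data))
for row in csv.reader(wrapper):
    progress = wrapper.bytes_read / total  # 0.0 .. 1.0, pollable elsewhere
```

The count slightly lags or leads the logical row position depending on buffering, but for a progress bar that approximation is fine.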
I would have to be committing as I go, but I want to do that for the rows I'm inserting anyway.
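Committing as you go while recording progress in a side table might be sketched like this - a simplified stand-in (function name, table names, and the row-based interval are assumptions; the comment above suggests flushing every few thousand bytes rather than every N rows) using stdlib `sqlite3`:

```python
import csv
import io
import sqlite3


def import_csv_with_progress(db, csv_text, table="rows", commit_every=2):
    """Sketch: insert CSV rows, committing periodically and updating a
    progress table that another connection could poll."""
    db.execute(
        "CREATE TABLE IF NOT EXISTS _upload_progress "
        "(id INTEGER PRIMARY KEY, rows_done INTEGER)"
    )
    db.execute("INSERT OR REPLACE INTO _upload_progress (id, rows_done) VALUES (1, 0)")
    reader = csv.reader(io.StringIO(csv_text))
    headers = next(reader)
    cols = ", ".join('"{}"'.format(h) for h in headers)
    placeholders = ", ".join("?" for _ in headers)
    db.execute('CREATE TABLE IF NOT EXISTS "{}" ({})'.format(table, cols))
    done = 0
    for row in reader:
        db.execute(
            'INSERT INTO "{}" ({}) VALUES ({})'.format(table, cols, placeholders), row
        )
        done += 1
        if done % commit_every == 0:
            # Commit mid-import so a polling connection sees fresh progress
            db.execute("UPDATE _upload_progress SET rows_done = ? WHERE id = 1", [done])
            db.commit()
    db.execute("UPDATE _upload_progress SET rows_done = ? WHERE id = 1", [done])
    db.commit()
    return done


db = sqlite3.connect(":memory:")
n = import_csv_with_progress(db, "id,name\n1,Cleo\n2,Pancakes\n3,Azi\n")
```

Because each periodic commit releases the write lock briefly, a separate read connection can `SELECT rows_done FROM _upload_progress` to drive the progress bar.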
I'm feeling pretty good about this approach - opening a pull request.
Fixed by #7 |
The GIF in #5 demonstrates how a large CSV upload results in a request that hangs for a while.
Starlette background tasks might help here: https://www.starlette.io/background/ - suggestion from encode/starlette#697 (comment)
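The general shape of that suggestion - respond to the upload immediately and do the slow CSV work off the request path - can be sketched with stdlib `asyncio` alone (Starlette's `BackgroundTask` wraps the same idea; the handler and helper names here are hypothetical, not the plugin's code):

```python
import asyncio
import time


def process_csv(path):
    """Stand-in for the blocking CSV parse + insert work."""
    time.sleep(0.1)  # simulate slow processing
    return "done"


async def handle_upload(path):
    # Kick the blocking work onto a thread and return to the client at once,
    # instead of hanging the request until processing finishes.
    task = asyncio.create_task(asyncio.to_thread(process_csv, path))
    return {"status": "processing"}, task


async def main():
    response, task = await handle_upload("upload.csv")
    assert response == {"status": "processing"}  # client got an answer immediately
    return await task  # in a real app the client would poll for progress instead


result = asyncio.run(main())
```

Note `asyncio.to_thread` needs Python 3.9+; on older versions `loop.run_in_executor` does the same job.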