Don't hang the request while processing the CSV #6

Closed
simonw opened this issue Feb 28, 2020 · 5 comments
Labels
enhancement New feature or request

Comments

simonw (Owner) commented Feb 28, 2020

The gif in #5 demonstrates how a large CSV upload results in a request that hangs for a while.

Starlette background tasks might help here: https://www.starlette.io/background/ - suggestion from encode/starlette#697 (comment)
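
Roughly what that idea would look like - just a sketch of the Starlette background task API, not code from this plugin, and process_csv here is a hypothetical helper:

from starlette.background import BackgroundTask
from starlette.responses import JSONResponse


async def process_csv(fileobj, filename):
    ...  # parse the CSV and insert rows into SQLite


async def upload(request):
    formdata = await request.form()
    upload_file = formdata["csv"]
    # The response is returned immediately; the CSV is processed after it is sent
    task = BackgroundTask(process_csv, upload_file.file, upload_file.filename)
    return JSONResponse({"status": "processing"}, background=task)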

simonw added the enhancement label Feb 28, 2020
simonw (Owner) commented Feb 28, 2020

Relevant code and comment:

import codecs
import csv as csv_std


async def post(self, request):
    formdata = await request.form()
    csv = formdata["csv"]
    # csv.file is a SpooledTemporaryFile, I can read it directly
    # NOTE: this is blocking - a better implementation would run this
    # in a thread.
    filename = csv.filename
    # TODO: Support other encodings:
    reader = csv_std.reader(codecs.iterdecode(csv.file, "utf-8"))
    headers = next(reader)
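
As a sketch of the "run this in a thread" idea (not what ended up shipping), the blocking parse could be pushed onto a thread pool with run_in_executor - parse_headers here is a hypothetical helper:

import asyncio
import codecs
import csv as csv_std


def parse_headers(fileobj, encoding="utf-8"):
    # Runs in a worker thread, so blocking file reads are fine here
    reader = csv_std.reader(codecs.iterdecode(fileobj, encoding))
    return next(reader)


async def post(self, request):
    formdata = await request.form()
    csv = formdata["csv"]
    loop = asyncio.get_event_loop()
    # Offload the blocking read so the event loop stays responsive
    headers = await loop.run_in_executor(None, parse_headers, csv.file)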

simonw (Owner) commented Feb 29, 2020

I think the easiest way to do this is with the new Datasette write queue mechanism. It's not ideal for long operations but I think it's OK in this particular case to lock the database write connection for the duration of processing the CSV.
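
Sketched out, that would mean pushing the whole import through execute_write_fn so it runs on the dedicated write thread - something along these lines (the helper and exact signature are illustrative, not the final implementation):

async def import_rows(datasette, database_name, table, columns, rows):
    db = datasette.get_database(database_name)

    def write_rows(conn):
        # Runs on Datasette's single write thread, holding the write connection
        column_list = ", ".join("[{}]".format(c) for c in columns)
        placeholders = ", ".join("?" for _ in columns)
        sql = "INSERT INTO [{}] ({}) VALUES ({})".format(
            table, column_list, placeholders
        )
        with conn:  # commits when the block exits
            conn.executemany(sql, rows)

    # block=True waits for the write to finish before returning
    await db.execute_write_fn(write_rows, block=True)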

What I'd really like is to be able to poll for progress on the conversion and show another progress bar.

Problem there is that I don't know how many rows in total are going to be imported - at least not without parsing the CSV file through a first time just to count the rows.

But... maybe I can use the file size trick? If I know the size of the file in bytes I can have the CSV reader work against a wrapper class which counts the number of bytes that have been read.
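
Something like this wrapper could do the byte counting - a minimal sketch, assuming the CSV reader pulls lines by iterating over the underlying binary file:

class ByteCountingWrapper:
    # Wraps a binary file and tracks how many bytes have been read,
    # so progress can be reported as bytes_read / total file size
    def __init__(self, fileobj):
        self.fileobj = fileobj
        self.bytes_read = 0

    def __iter__(self):
        return self

    def __next__(self):
        chunk = self.fileobj.readline()
        if not chunk:
            raise StopIteration
        self.bytes_read += len(chunk)
        return chunk

The existing line would then become reader = csv_std.reader(codecs.iterdecode(wrapper, "utf-8")) where wrapper = ByteCountingWrapper(csv.file), and wrapper.bytes_read divided by the file size gives the percentage.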

I would still need a mechanism for the write thread to make the progress available to be polled. How about having it write to a temporary table every few thousand bytes or so?
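
For the polling side, the write thread could periodically upsert a row into a progress table and commit, so an ordinary SQL query can read it. A sketch - the _upload_progress_ table name is made up purely for illustration:

def record_progress(conn, upload_id, bytes_read, bytes_total):
    # Called from the write thread every few thousand bytes
    conn.execute(
        "CREATE TABLE IF NOT EXISTS _upload_progress_ "
        "(id TEXT PRIMARY KEY, bytes_read INTEGER, bytes_total INTEGER)"
    )
    conn.execute(
        "INSERT OR REPLACE INTO _upload_progress_ (id, bytes_read, bytes_total) "
        "VALUES (?, ?, ?)",
        (upload_id, bytes_read, bytes_total),
    )
    conn.commit()  # commit so polling requests on read connections can see it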

simonw (Owner) commented Feb 29, 2020

I would have to be committing as I go, but I want to do that with the rows I'm inserting anyway.

simonw (Owner) commented Feb 29, 2020

I'm feeling pretty good about this approach - opening a pull request.

simonw added a commit that referenced this issue Mar 3, 2020
* WIP progress bars for CSV processing, refs #6 and #4

* Initial work on progress bar and drag/drop

* Test for progress bar table

* Write progress record to DB earlier

* Implemented server-side processing progress bar

* Use correct database path for Ajax polling

* Depend on Datasette 0.37.1
simonw (Owner) commented Mar 3, 2020

Fixed by #7

simonw closed this as completed Mar 3, 2020