This repository has been archived by the owner on Dec 27, 2019. It is now read-only.

read and chunk large text column #44

Open
poyaz opened this issue Aug 1, 2018 · 2 comments

poyaz commented Aug 1, 2018

Hi, I created a table with a `keys` column and a `body` column. The `body` column holds large text data (~20 MB), and I want to read the whole value of `body`.

| Column | Type      | Value          |
| ------ | --------- | -------------- |
| keys   | numeric[] | array[1, 2]    |
| body   | text      | abcd... (20 MB) |

Now, can I split the value of `body` across records, OR read the whole value of `body` in chunks?

pg-query-stream streams record by record — can it also chunk the value of a single column?
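As a workaround (this is a sketch, not something pg-query-stream supports itself), the column can be fetched piecewise by issuing one `substring(body FROM start FOR length)` query per window, so only one chunk is ever in memory. The helper below only computes the 1-based windows; `chunkRanges`, the table name `t`, and the query shown in the comment are all hypothetical:

```javascript
// Hypothetical helper: compute 1-based (start, length) windows for
// Postgres substring(body FROM start FOR length), so a ~20 MB text
// column can be fetched in pieces instead of as one huge row value.
function chunkRanges(totalLength, chunkSize) {
  const ranges = [];
  for (let start = 1; start <= totalLength; start += chunkSize) {
    // Clamp the final window so it does not run past the end of the value.
    ranges.push([start, Math.min(chunkSize, totalLength - start + 1)]);
  }
  return ranges;
}

// Each range would map onto a query such as (assumed usage with node-postgres):
//   SELECT substring(body FROM $1 FOR $2) AS piece FROM t WHERE keys = $3
// issued once per range; totalLength can come from SELECT length(body).
```

The up-front `length(body)` query is the price of this approach; the win is that the server, not the client, does the slicing.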

@poyaz poyaz changed the title read large text column read and chunk large text column Aug 1, 2018

polesen commented Oct 9, 2018

This looks like the same thing I am looking for.

I guess this project - node-pg-query-stream - is about streaming large numbers of rows without exhausting memory, but we also need a way to stream a large bytea column value from a single row.

Given the above example, I am thinking of something along the lines of support for `select body from t`, where the query result exposes a stream for each `row.body`.

Is this already possible somehow?


polesen commented Oct 9, 2018

And likewise, we also need a way to stream data into a column when inserting. Currently I am passing in a Buffer, but it would be nice to be able to provide a stream, which would then be streamed to the server.
