how to access stored field and scroll_id #299
Comments
Did you try to set the time_scroll parameter?
@cphaarmeyer thank you! Specifying the time_scroll parameter for Search() was sufficient to access _scroll_id. The working code looks like this:
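(The resolved snippet itself was not preserved in this copy of the thread. A minimal sketch of what the fix looks like with the elastic R package — index name, host, and keep-alive value are illustrative, not taken from the thread:)

```r
library(elastic)

# connect to the cluster (host/port are illustrative defaults)
conn <- connect(host = "127.0.0.1", port = 9200)

# passing time_scroll keeps the search context alive on the server,
# so the response carries a non-NULL `_scroll_id`
res <- Search(conn, index = "document-000002", time_scroll = "5m")
scroll_id <- res$`_scroll_id`

# subsequent pages are then fetched with scroll() until hits run out
nxt <- scroll(conn, scroll_id, time_scroll = "5m")
```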
Do you have any thoughts on how to pull stored field? |
Do you mean something like this?
Indeed, the code you propose is more or less what the Search() documentation suggests, and I also tried it (although not as part of the body). However, it doesn't work, because "text" is a stored field, not part of "_source" (see the structure of the record I pasted in my question). According to the documentation, it should be pulled by specifying the stored_fields parameter, but that is not the case.
Oh sorry. Then I don't know. I have never seen such a setup.
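As an aside: in the raw Elasticsearch search API, stored fields are requested via a "stored_fields" array in the request body, and they come back under "fields" rather than "_source". An untested sketch of that workaround through Search()'s body argument (index name assumed from the thread):

```r
library(elastic)
conn <- connect()

# request the stored field in the search body instead of a parameter;
# stored fields are returned under "fields", not "_source"
res <- Search(conn, index = "document-000002", body = '{
  "stored_fields": ["text"],
  "query": { "match_all": {} }
}')

# each hit should then carry fields$text if "text" is indeed stored
res$hits$hits[[1]]$fields$text
```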
Hello, I am having difficulty pulling a stored field and the scroll_id.
Stored field:
The field is called "text", and in Kibana I can see it is present for the index "document-000002". When I specify "text" as the value of the stored_fields parameter, it is not pulled; instead only "_index", "_type", "_id", "_score" and "_source" are present in the resulting list (first two lines of code). When I tested the line with the source parameter, the "_source" element was an empty list.
An example record from ES, accessed via Kibana:
Tried code:
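(The original snippet was lost when this thread was copied. Based on the description above, the attempts presumably looked roughly like the following — index name and exact calls are a reconstruction, not verbatim:)

```r
library(elastic)
conn <- connect()

# first two lines: stored_fields has no apparent effect --
# hits still contain only _index, _type, _id, _score and _source
res1 <- Search(conn, index = "document-000002", stored_fields = "text")
res2 <- Search(conn, index = "document-000002", stored_fields = "text", size = 10)

# the source-parameter variant returns an empty "_source" element
res3 <- Search(conn, index = "document-000002", source = "text")
```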
scroll_id:
I would like to use the scroll parameter to pull more than the default 10,000 documents for the same index. It should be possible, because the total hits amount to more than 8 million. However, the scroll ID is always NULL.
I would appreciate any help.
ES version in use: 7.3.1
elastic package version in use: 1.2.0