
query: Series API returns null data #1325

Closed
forestsword opened this issue Jul 12, 2019 · 2 comments · Fixed by #1327

Comments

@forestsword

Related to, but not quite, #576

Thanos, Prometheus and Golang version used

  • Thanos improbable/thanos:v0.5.0
  • Prometheus prometheus/prometheus:v2.10.0

What happened?

I queried the series api for a metric that doesn't exist and the data field of the response was null:

# curl http://thanos-query.monitoring.svc.cluster.local:9090/api/v1/series?match%5B%5D=no_exist | jq .
{
  "status": "success",
  "data": null
}

What you expected to happen

I expected it to return an empty array like the Prometheus series api does:

# curl http://prometheus-k8s-a.monitoring.svc.cluster.local:9090/api/v1/series?match%5B%5D=no_exist | jq .
{
  "status": "success",
  "data": []
}

How to reproduce it (as minimally and precisely as possible):

Query the series api for a metric that does not exist.

Anything else we need to know

We came across this issue while debugging the prometheus-adapter. We had recently switched its endpoint from Prometheus to the Thanos Querier, and it began hitting errors, presumably because Thanos returns null instead of an empty array.

After a little hunting I found that the series function for the API can in fact return nil if there are no series (see here). The variable is never initialized, unlike in the Prometheus code here. I can't, however, work out how the nil makes it back out in the response.
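For illustration, here is a minimal Go sketch (not Thanos code) of why an uninitialized slice ends up as null in the JSON response while an initialized empty slice ends up as an empty array:

package main

import (
	"encoding/json"
	"fmt"
)

func main() {
	// A nil slice (declared but never initialized) marshals to JSON null.
	var uninitialized []string
	out, _ := json.Marshal(map[string]interface{}{"data": uninitialized})
	fmt.Println(string(out)) // prints {"data":null}

	// An explicitly initialized empty slice marshals to an empty array,
	// which is what the Prometheus API returns.
	initialized := []string{}
	out, _ = json.Marshal(map[string]interface{}{"data": initialized})
	fmt.Println(string(out)) // prints {"data":[]}
}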

This is probably not a big deal for most people, since you'll usually be querying for metrics that exist. But we operate a multi-tenant cluster and can't guarantee that the metrics we're using for the adapter will always be there. As a short-term fix we're going to try a recording rule that fakes the metric so we can continue the switch to Thanos. I'd be happy to do a pull request, but as I said, I couldn't find how the nil makes it into the response rather than a no-content code being returned. A push in the right direction would be appreciated.

@GiedriusS
Member

This should definitely be investigated & fixed since we want to have identical behaviour to Prometheus. It's probably related to the fact that a SeriesResponse might not contain any series at all and could contain only warnings. Thanks for the report!

GiedriusS added a commit to GiedriusS/thanos that referenced this issue Jul 13, 2019
Prometheus' /series endpoint returns an empty array no matter what, whether there were results or not. We need to follow the same principle in Thanos because our users depend on it.

The same fix has been applied as here:
https://github.com/prometheus/prometheus/blob/fef150f1b5e48652ec6779e2129ae9a0cf13db8a/web/api/v1/api.go#L506

Tests have been updated to account for this difference, and one test has been added specifically for this case.

Closes thanos-io#1325.
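For context, a rough sketch of the pattern that fix follows, assuming the Prometheus storage and labels packages (the package, function, and variable names below are illustrative, not the actual Thanos change): allocate the result as an empty slice up front so a query with no matches still encodes as an empty array.

// Illustrative sketch only; package, function and variable names are
// assumptions, not the actual Thanos change.
package api

import (
	"github.com/prometheus/prometheus/pkg/labels"
	"github.com/prometheus/prometheus/storage"
)

// collectSeries gathers the label sets from a SeriesSet into a slice that
// is initialized up front, so a query with no matches marshals to []
// rather than null.
func collectSeries(set storage.SeriesSet) []labels.Labels {
	metrics := []labels.Labels{}
	for set.Next() {
		metrics = append(metrics, set.At().Labels())
	}
	return metrics
}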
@GiedriusS
Member

PTAL #1327.

brancz pushed a commit that referenced this issue Jul 15, 2019