
Fix flaky TestLanguageServerMultipleFiles #931

Closed
anderseknert opened this issue Jul 21, 2024 · 1 comment · Fixed by #933
Labels
bug Something isn't working

Comments

@anderseknert
Member

This is annoying and a bad contributor experience, as it's nothing contributors have caused or can address themselves. We should fix this ASAP.

   --- FAIL: TestLanguageServerMultipleFiles (4.01s)
      server_test.go:406: timed out waiting for authz.rego diagnostics to be sent
  ERROR: failed to update aggregate diagnostics (trigger): failed to lint: failed to lint using Rego aggregate rules: error encountered in query evaluation eval_cancel_error: caller cancelled query execution
  ERROR: failed to send diagnostic: failed to notify: jsonrpc2: connection is closed
  ERROR: failed to send diagnostic: failed to notify: jsonrpc2: connection is closed
  FAIL
@anderseknert anderseknert added the bug Something isn't working label Jul 21, 2024
charlieegan3 added a commit that referenced this issue Jul 22, 2024
@charlieegan3
Member

I've done some digging here, and it looks to me like this is (now, after some previous adjustments to this test) simply a case of the timeouts being too aggressive to be reliable on GitHub Actions.

I have been doing some testing today and found that, across 300 runs, the maximum time to wait for a file's state to become consistent with the expected value was around 1.8s.

I think the fix here is to make the timeouts longer for now, as I've not been able to reproduce any locking behaviour that causes this test to fail in any way other than via the timeout.

I have increased the timeout in #933

charlieegan3 added a commit that referenced this issue Jul 22, 2024