"getaddrinfo EAI_AGAIN" on CI #1218
Comments
Hey, @benjdlambert. Thanks for reporting this. I suspect the issue is related to the new interceptor we've introduced. Thank you for preparing the reproduction repository! I will look into it once I have a minute.
Great, yep, that sounds like the likely cause here. I think it's fair to say it's been a bit of a head scratcher for us, trying to work out why it only fails in some situations and not others, so any help would be much appreciated! Thanks!
We are certainly not accounting for that case.
I'm going to add that.
Nice! Sorry for the delay, I've been off for a few days. I've seen that it's been released under 0.16, which doesn't satisfy the current semver range.
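For context on the semver mismatch mentioned above: npm's caret ranges are special-cased for 0.x versions, where a range like `^0.15.0` only allows patch-level updates, so a 0.16 release falls outside it. A toy check (a hypothetical helper, not the real `semver` package, and it ignores patch-level comparison) to illustrate:

```javascript
// Toy illustration of caret-range matching for 0.x versions only.
// For "^0.MINOR.PATCH", npm treats minor bumps as breaking, so the
// range resolves 0.MINOR.x but never 0.(MINOR+1).0.
function caretZeroMatches(range, version) {
  const [, rMinor] = range.replace('^', '').split('.').map(Number);
  const [major, vMinor] = version.split('.').map(Number);
  return major === 0 && vMinor === rMinor;
}

console.log(caretZeroMatches('^0.15.0', '0.15.2')); // true
console.log(caretZeroMatches('^0.15.0', '0.16.0')); // false
```

This is why a dependency released as 0.16 is not picked up by a package that declares `^0.15.x`.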
Hey, @benjdlambert. I'm currently working on updating the Interceptors dependency in MSW. There have been quite a few breaking changes, so it is taking time to fully migrate. I can't give any estimates, as I approach this in my free time.
No probs at all! Please let us know if there's any way I can help out 🙏
Released: v0.42.0 🎉

This has been released in v0.42.0! Make sure to always update to the latest version.

Predictable release automation by @ossjs/release.
Prerequisites

Environment check

msw version:
Node.js version: 14.x / 16.x
Reproduction repository
https://github.com/benjdlambert/msw-example-repo
Reproduction steps
Unable to reproduce this locally; I think it's some form of race condition, but it happens in our CI, which is GitHub Actions running ubuntu-latest. The same tests have been known to pass using macos-latest.
https://github.com/benjdlambert/msw-example-repo/runs/6130933244?check_suite_focus=true is a failing run; notice in the logs how the mock is not stable under `run test`.

Current behavior
We have recently upgraded from 0.35 to latest in this PR: backstage/backstage#10589, and noticed that there are some failures for some tests. I've been digging into this for the last few days to work out what is going on and whether it's something we own and could fix.
It looks like the mock is not stable once set up: repeated requests to the mocked endpoint will sometimes yield the mocked data, and will sometimes try to make the request to the origin, which for the test case I have reported is an unknown DNS name, hence the error we get.

As you can see in the simple reproduction example I have provided, we set up a mock and then repeatedly call that mocked endpoint in the test, getting DNS resolution errors sporadically throughout the test run. This is of course an extreme example, hitting it many times, but it also happens when it's not called under load, and it causes flaky test failures.
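The failure mode described above can be modelled with a toy sketch (purely illustrative; this is not MSW's actual implementation): if the request interceptor attaches asynchronously, a request fired before it is ready falls through to the real network and surfaces as a DNS-style error.

```javascript
// Toy model of the suspected race (NOT MSW's real code): requests made
// before the interceptor has finished attaching "escape" to the origin.
class ToyInterceptor {
  constructor() {
    this.ready = false;
  }
  attach(delayMs) {
    // Attaching completes asynchronously, like patching http internals.
    return new Promise((resolve) =>
      setTimeout(() => { this.ready = true; resolve(); }, delayMs));
  }
  request(url) {
    // With the hook in place we return the mock; otherwise the request
    // escapes to the network, which here manifests as a DNS error.
    if (this.ready) return { mocked: true, url };
    throw new Error('getaddrinfo EAI_AGAIN ' + url);
  }
}

(async () => {
  const interceptor = new ToyInterceptor();
  const attaching = interceptor.attach(10); // still attaching...
  try {
    interceptor.request('http://example.test'); // fired too early
  } catch (err) {
    console.log(err.message); // "getaddrinfo EAI_AGAIN http://example.test"
  }
  await attaching;
  console.log(interceptor.request('http://example.test').mocked); // true
})();
```

Whether the real race lives in the interceptor's attach step is an assumption; the point is only that a timing window like this would explain why the same suite passes on a differently-paced runner.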
Interestingly, running the same tests under macos-latest as the agent in CI seemed to work, so I'm not sure whether the relative speed of one of the runners is mitigating this; maybe it's a race condition.

I haven't had much time to dig into the source code, but it looks like this could be something that was changed or added in the versions between 0.35 and now, as our other test suites are stable.
Expected behavior
The `fetch` request returns the mock data all the time, rather than sometimes trying to bypass the mock and actually calling out to the original origin.
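As a diagnostic aid while this is being investigated (a sketch assuming MSW 0.x's `setupServer` API; the endpoint URL is hypothetical), MSW's `onUnhandledRequest` option can make any request that escapes the mock fail loudly instead of reaching the real origin:

```javascript
// Sketch assuming MSW 0.x (`msw/node`); the URL is a placeholder.
import { rest } from 'msw';
import { setupServer } from 'msw/node';

const server = setupServer(
  rest.get('https://example.test/data', (req, res, ctx) =>
    res(ctx.json({ mocked: true })),
  ),
);

// 'error' turns any request that is not matched by a handler into a
// test failure, instead of letting it hit the network.
server.listen({ onUnhandledRequest: 'error' });
```

This doesn't fix the underlying race, but it turns the sporadic origin calls into deterministic, visible errors in the test output.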