
IE11 webUI test suite hangs #31022

Closed
phil-davis opened this issue Apr 6, 2018 · 8 comments

@phil-davis
Contributor

IE11 webUI test suites run in the nightly Travis cron job on core stable10. Quite often (maybe 30% to 40% of the time?) they fail with:

No output has been received in the last 10m0s, this potentially indicates a stalled build or something wrong with the build itself.

or

  Scenario: Select some files and delete from trashbin in a batch                    # /home/[secure]/build/owncloud/core/tests/acceptance/features/webUITrashbin/trashbinDelete.feature:34
    And the user has logged in with username "user1" and password "1234" using the webUI # WebUILoginContext::theUserLogsInWithUsernameAndPasswordUsingTheWebUI()
      exception 'WebDriver\Exception\CurlExec' with message 'Payload received from webdriver is not valid json: <html><body><h1>504 Gateway Time-out</h1>
      The server didn't respond in time.

Both of these look like something in the chain of communication between Behat, Selenium, the IE11 webdriver and the actual browser "goes wrong" and things "just stop happening".

Restarting the job usually gets a pass the 2nd time. If not, then the 3rd or 4th attempt is lucky. The tests themselves run fine when the job passes - so at least we still know that all the scenarios do work on IE11.

It would be nice to find out (and fix!) what the issue is here, but don't hold your breath. The care factor from MS for maintaining the IE11 webdriver etc. is low to zero.
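Not a fix, but a common mitigation for the Travis "No output has been received in the last 10m0s" kill is a background heartbeat that keeps the log alive while the webdriver chain is silently stalled. A minimal sketch (the function name and the 60-second interval are arbitrary choices, not anything from this repo's CI config):

```shell
# Print a heartbeat so Travis does not kill the job after 10 minutes
# of silence while a long, quiet test run is in progress.
keepalive() {
    while true; do
        echo "keepalive: still running ($(date))"
        sleep 60
    done
}

keepalive &
KEEPALIVE_PID=$!

# ... run the long, possibly silent test command here, e.g.:
#   behat --config tests/acceptance/config/behat.yml

kill "$KEEPALIVE_PID" 2>/dev/null
```

Travis's built-in `travis_wait` helper serves a similar purpose for a single long-running command; a heartbeat like the above works when the silence can occur anywhere in a multi-step script.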

@ownclouders
Contributor

GitMate.io thinks possibly related issues are #22492 (test), #22485 (test), #30897 (webUI test suite webUISharingExternal failing on Travis), #27315 (IE11 JS unit tests failures), and #30681 (Rename UI test suites).

@ownclouders
Contributor

Hey, this issue has been closed because the label status/STALE is set and there were no updates for 7 days. Feel free to reopen this issue if you deem it appropriate.

(This is an automated comment from GitMate.io.)


@phil-davis
Contributor Author

IMO this can stay closed. We need to separately sort out again the whole "running different browsers on Travis" thing, because it seems to have become unreliable again in the last month or so.

@patrickjahns
Contributor

Let's evaluate how we can get SauceLabs to join the 🤖 force for Drone and see how we can move forward.

@phil-davis
Contributor Author

Yes, in recent times when I tried to run jobs on Travis again, the ownCloud server would intermittently "go away" (stop responding). I very much suspect a PHP dev server segfault (which I get sometimes when I run the PHP dev server locally, with both 7.1 and 7.2). I put the PHP dev server output into the Travis log, mixed in with the Behat output, but that did not help identify what went wrong. I probably need a script that goes looking for the core dump file...
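A helper along those lines could enable core dumps before starting the server and scan for them after the test run. A sketch, with the real server invocation left as a placeholder (`SERVER_CMD` is a hypothetical variable, and where core files land depends on the kernel's core_pattern):

```shell
#!/bin/sh
# Let the server process write a core dump if it segfaults.
# (May be disallowed in some restricted environments, hence the guard.)
ulimit -c unlimited 2>/dev/null || true

# Stand-in for the real server command; on CI this would be e.g.
#   php -S localhost:8080 -t /path/to/owncloud
SERVER_CMD="${SERVER_CMD:-sleep 300}"
$SERVER_CMD &
SERVER_PID=$!

# ... run the Behat acceptance tests against the server here ...

# Scan for core files a crashed server may have left behind. The actual
# location/name depends on /proc/sys/kernel/core_pattern, so this only
# checks the conventional names in the working directory.
for core in core core.*; do
    if [ -f "$core" ]; then
        echo "Found core dump: $core"
    fi
done

kill "$SERVER_PID" 2>/dev/null
```

Feeding a found core file to `gdb` with the matching `php` binary would then give a backtrace of the segfault.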

If we get the SauceLabs connection going from Drone, then we will already have the Apache server in place. We will just need to think about how to stay under the 5-concurrent-job SauceLabs limit.
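One generic way to stay under a fixed concurrency cap is to gate each job start behind a polling loop. In this sketch, `current_job_count` is a hypothetical helper (not a real SauceLabs API call) that a real setup would replace with a query against the account's running-job count:

```shell
MAX_JOBS=5

# Hypothetical helper: a real implementation would ask SauceLabs (or the
# CI system) how many browser jobs are currently running.
current_job_count() {
    echo 0
}

# Wait until a slot is free before launching the next browser job.
while [ "$(current_job_count)" -ge "$MAX_JOBS" ]; do
    echo "SauceLabs at capacity, waiting..."
    sleep 30
done
echo "Slot free, starting job"
```

The same gate works for any external service with a hard parallelism limit; the cost is that queued CI jobs sit idle while polling.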

@patrickjahns
Contributor

We could temporarily use our docker container for testing. As long as all tests can be executed "remotely" this would work. We can have a look together

@lock

lock bot commented Jul 30, 2019

This thread has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs.

@lock lock bot locked as resolved and limited conversation to collaborators Jul 30, 2019