
FAILURE: BUILD AN OPENSHIFT-ANSIBLE RELEASE #14122

Closed

sdodson opened this issue May 10, 2017 · 16 comments

Labels
component/internal-tools kind/test-flake priority/P1

Comments

@sdodson
Member

sdodson commented May 10, 2017

Seen in https://ci.openshift.redhat.com/jenkins/job/test_pull_request_openshift_ansible_extended_conformance_install_with_status_check/260/consoleFull#-157296560358b6e51eb7608a5981914356

I think this happened because the tag is on the stage branch but this PR is against master?

########## STARTING STAGE: BUILD AN OPENSHIFT-ANSIBLE RELEASE ##########
+ [[ -s /var/lib/jenkins/jobs/test_pull_request_openshift_ansible_extended_conformance_install_with_status_check/workspace/activate ]]
+ source /var/lib/jenkins/jobs/test_pull_request_openshift_ansible_extended_conformance_install_with_status_check/workspace/activate
++ export VIRTUAL_ENV=/var/lib/jenkins/origin-ci-tool/b58d70e4ba8547fb716da04deb467ce17ccf4345
++ VIRTUAL_ENV=/var/lib/jenkins/origin-ci-tool/b58d70e4ba8547fb716da04deb467ce17ccf4345
++ export PATH=/var/lib/jenkins/origin-ci-tool/b58d70e4ba8547fb716da04deb467ce17ccf4345/bin:/sbin:/usr/sbin:/bin:/usr/bin
++ PATH=/var/lib/jenkins/origin-ci-tool/b58d70e4ba8547fb716da04deb467ce17ccf4345/bin:/sbin:/usr/sbin:/bin:/usr/bin
++ unset PYTHON_HOME
++ export OCT_CONFIG_HOME=/var/lib/jenkins/jobs/test_pull_request_openshift_ansible_extended_conformance_install_with_status_check/workspace/.config
++ OCT_CONFIG_HOME=/var/lib/jenkins/jobs/test_pull_request_openshift_ansible_extended_conformance_install_with_status_check/workspace/.config
++ mktemp
+ script=/tmp/tmp.SXCuZksfW9
+ cat
+ chmod +x /tmp/tmp.SXCuZksfW9
+ scp -F ./.config/origin-ci-tool/inventory/.ssh_config /tmp/tmp.SXCuZksfW9 openshiftdevel:/tmp/tmp.SXCuZksfW9
+ ssh -F ./.config/origin-ci-tool/inventory/.ssh_config -t openshiftdevel 'bash -l -c "/tmp/tmp.SXCuZksfW9"'
+ cd /data/src/github.com/openshift/openshift-ansible
+ tito_tmp_dir=tito
+ mkdir -p tito
+ tito tag --offline --accept-auto-changelog
Creating output directory: /tmp/tito
Tagging new version of openshift-ansible: 3.6.62-1 -> 3.6.63-1
Traceback (most recent call last):
  File "/usr/bin/tito", line 23, in <module>
    CLI().main(sys.argv[1:])
  File "/usr/lib/python2.7/site-packages/tito/cli.py", line 202, in main
    return module.main(argv)
  File "/usr/lib/python2.7/site-packages/tito/cli.py", line 666, in main
    return tagger.run(self.options)
  File "/usr/lib/python2.7/site-packages/tito/tagger/main.py", line 114, in run
    self._tag_release()
  File "/usr/lib/python2.7/site-packages/tito/tagger/main.py", line 136, in _tag_release
    self._check_tag_does_not_exist(self._get_new_tag(new_version))
  File "/usr/lib/python2.7/site-packages/tito/tagger/main.py", line 501, in _check_tag_does_not_exist
    raise Exception("Tag %s already exists!" % new_tag)
Exception: Tag openshift-ansible-3.6.63-1 already exists!
++ export status=FAILURE
++ status=FAILURE
+ set +o xtrace
########## FINISHED STAGE: FAILURE: BUILD AN OPENSHIFT-ANSIBLE RELEASE ##########
@sdodson
Member Author

sdodson commented May 10, 2017

I think I've fixed this by tagging stage as 3.6.65.0 and master as 3.6.66.

@stevekuznetsov
Contributor

Yeah, we need to be using that four-number workaround until we have a better solution, but this isn't necessarily a tooling or testing issue -- we just need to have a method for making unique tags.

@sdodson
Member Author

sdodson commented May 10, 2017

Well, the stage branching should be handled by tooling, so I'm just making sure you're aware of this and that we need to append .0 to the current version when branching and tagging stage.

@stevekuznetsov
Contributor

Is this problem solved? Are the people reporting this above doing so from old failures? Do we need to take action right now?

@sdodson
Member Author

sdodson commented May 10, 2017

Those are old failures. I fixed it at

commit 5bd4ca446711fc7faa53ad0d6d3f2183857938d1
Author: Scott Dodson <[email protected]>
Date:   Wed May 10 08:50:14 2017 -0400

    Automatic commit of package [openshift-ansible] release [3.6.66-1].
    
    Created by command:
    
    /usr/bin/tito tag --keep-version

@tnozicka
Contributor

@mfojtik
Contributor

mfojtik commented May 31, 2017

Note that this happened exactly when the new tag was cut for OSE, and the version in the error matches, so maybe it's just bad timing.

@tnozicka
Contributor

#14322 hit it again - https://ci.openshift.redhat.com/jenkins/job/merge_pull_request_origin/853/

This seems to be serious :)

@sdodson
Member Author

sdodson commented May 31, 2017

On the stage branch I set the RPM specfile version to 3.6.88.0, tagged, and pushed to GitHub.
On the master branch I set the RPM specfile version to 3.6.89, tagged, and pushed to GitHub.

We need to fix the stage branch tooling to append a fourth version segment when forking that branch.

Looks like @jupierce built the jobs for that, assigning to him.

@jupierce
Contributor

@sdodson I think the only patch that will hold up through stagecut is adding a .0 to the openshift-ansible.spec in master. A change in the stage branch will be wiped out every time we build stage (since the o-a spec version is always replaced by the ose spec version). When stagecut ends and we return to master, the .0 will similarly be wiped out.
Discussing longer term solutions with @stevekuznetsov to determine the best route.

@jupierce
Contributor

Should be fixed with openshift-eng/aos-cd-jobs@ca7f5b2

@enj enj reopened this Jun 24, 2017
@enj
Contributor

enj commented Jun 24, 2017

Seen in #14853 https://ci.openshift.redhat.com/jenkins/job/merge_pull_request_origin/1111/

########## FINISHED STAGE: SUCCESS: DETERMINE THE RELEASE COMMIT FOR ORIGIN IMAGES [00h 00m 14s] ##########
[workspace] $ /bin/bash /tmp/hudson1315612192578537421.sh
########## STARTING STAGE: BUILD AN OPENSHIFT-ANSIBLE RELEASE ##########
+ [[ -s /var/lib/jenkins/jobs/test_pull_request_origin_extended_conformance_install_update/workspace/activate ]]
+ source /var/lib/jenkins/jobs/test_pull_request_origin_extended_conformance_install_update/workspace/activate
++ export VIRTUAL_ENV=/var/lib/jenkins/origin-ci-tool/9aea3b4f81e266b026e21975a3a6a5a1cfddd890
++ VIRTUAL_ENV=/var/lib/jenkins/origin-ci-tool/9aea3b4f81e266b026e21975a3a6a5a1cfddd890
++ export PATH=/var/lib/jenkins/origin-ci-tool/9aea3b4f81e266b026e21975a3a6a5a1cfddd890/bin:/sbin:/usr/sbin:/bin:/usr/bin
++ PATH=/var/lib/jenkins/origin-ci-tool/9aea3b4f81e266b026e21975a3a6a5a1cfddd890/bin:/sbin:/usr/sbin:/bin:/usr/bin
++ unset PYTHON_HOME
++ export OCT_CONFIG_HOME=/var/lib/jenkins/jobs/test_pull_request_origin_extended_conformance_install_update/workspace/.config
++ OCT_CONFIG_HOME=/var/lib/jenkins/jobs/test_pull_request_origin_extended_conformance_install_update/workspace/.config
++ mktemp
+ script=/tmp/tmp.4YCB5OYOuv
+ cat
+ chmod +x /tmp/tmp.4YCB5OYOuv
+ scp -F ./.config/origin-ci-tool/inventory/.ssh_config /tmp/tmp.4YCB5OYOuv openshiftdevel:/tmp/tmp.4YCB5OYOuv
+ ssh -F ./.config/origin-ci-tool/inventory/.ssh_config -t openshiftdevel 'bash -l -c "/tmp/tmp.4YCB5OYOuv"'
+ cd /data/src/github.com/openshift/openshift-ansible
+ tito_tmp_dir=tito
+ mkdir -p tito
+ tito tag --offline --accept-auto-changelog
Creating output directory: /tmp/tito
Tagging new version of openshift-ansible: 3.6.123-1 -> 3.6.124-1
Traceback (most recent call last):
  File "/usr/bin/tito", line 23, in <module>
    CLI().main(sys.argv[1:])
  File "/usr/lib/python2.7/site-packages/tito/cli.py", line 202, in main
    return module.main(argv)
  File "/usr/lib/python2.7/site-packages/tito/cli.py", line 666, in main
    return tagger.run(self.options)
  File "/usr/lib/python2.7/site-packages/tito/tagger/main.py", line 114, in run
    self._tag_release()
  File "/usr/lib/python2.7/site-packages/tito/tagger/main.py", line 136, in _tag_release
    self._check_tag_does_not_exist(self._get_new_tag(new_version))
  File "/usr/lib/python2.7/site-packages/tito/tagger/main.py", line 501, in _check_tag_does_not_exist
    raise Exception("Tag %s already exists!" % new_tag)
Exception: Tag openshift-ansible-3.6.124-1 already exists!
++ export status=FAILURE
++ status=FAILURE
+ set +o xtrace
########## FINISHED STAGE: FAILURE: BUILD AN OPENSHIFT-ANSIBLE RELEASE [00h 00m 02s] ##########
Build step 'Execute shell' marked build as failure
[PostBuildScript] - Execution post build scripts.
[workspace] $ /bin/bash /tmp/hudson2579696756842582650.sh

@jupierce
Contributor

Fixed again. The present workaround relies on careful management of Jenkins job scheduling. Working on a more robust solution.

@jupierce
Contributor

@stevekuznetsov I think I asked this before during our discussion, but I'll lay it out there again. Addressing this completely outside of CI is tricky. I think a simple and effective approach would be for CI to use sed to append .1000 to any version it finds in openshift-ansible.spec before running tito tag. This is effectively what has to happen outside of CI to avoid this problem, but performing it outside of CI involves git commits and adds complexity to entering and exiting stagecut.

@stevekuznetsov
Contributor

Not the cleanest thing in the world but we can try that on Monday.

@jupierce
Contributor

We should now have a reliable fix in place:

  • master branches will build with three fields (3.6.X)
  • stage branches will build with four fields (3.6.X.Y). openshift/openshift-ansible@f9c47bf
  • enterprise branches will build with five fields (3.6.X.Y.Z)


7 participants