0.22.0: pytest fails in two units #283

Open
kloczek opened this issue Sep 1, 2023 · 12 comments · Fixed by #284
Labels
  • area/general: Related to whole service, not a specific part/integration.
  • complexity/single-task: Regular task, should be done within days.
  • gain/low: This doesn't bring that much value to users.
  • impact/low: This issue impacts only a few users.
  • kind/bug: Something isn't working.

Comments

@kloczek

kloczek commented Sep 1, 2023

What happened? What is the problem?

Looks like pytest fails in two test units.

What did you expect to happen?

pytest should not fail.

Example URL(s)

N/A

Steps to reproduce

I'm packaging your module as an RPM package, so I'm using the typical PEP 517 based build, install, and test cycle used when building packages from a non-root account (a condensed sketch of these steps is shown right after the list):

  • python3 -sBm build -w --no-isolation
  • because I'm calling build with --no-isolation, only locally installed modules are used during the whole process
  • install the .whl file into </install/prefix> using the `installer` module
  • run pytest with $PYTHONPATH pointing to sitearch and sitelib inside </install/prefix>
  • the build is performed in an env which is cut off from access to the public network (pytest is executed with -m "not network")
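
A condensed sketch of that cycle (the </install/prefix> placeholder and the Python 3.8 site-packages paths are illustrative, not the literal build-root paths):

# PEP 517 build and install using only locally available modules
python3 -sBm build -w --no-isolation
python3 -m installer --destdir=</install/prefix> dist/*.whl
# test the installed package offline
PYTHONPATH=</install/prefix>/usr/lib64/python3.8/site-packages:</install/prefix>/usr/lib/python3.8/site-packages pytest -ra -m "not network"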

Here is the pytest output:

+ PYTHONPATH=/home/tkloczko/rpmbuild/BUILDROOT/python-specfile-0.22.0-2.fc35.x86_64/usr/lib64/python3.8/site-packages:/home/tkloczko/rpmbuild/BUILDROOT/python-specfile-0.22.0-2.fc35.x86_64/usr/lib/python3.8/site-packages
+ /usr/bin/pytest -ra -m 'not network'
==================================================================================== test session starts ====================================================================================
platform linux -- Python 3.8.18, pytest-7.4.0, pluggy-1.3.0
rootdir: /home/tkloczko/rpmbuild/BUILD/specfile-0.22.0
collected 272 items

tests/integration/test_specfile.py .......................................                                                                                                            [ 14%]
tests/performance/test_parse.py F                                                                                                                                                     [ 14%]
tests/unit/test_changelog.py ....................................                                                                                                                     [ 27%]
tests/unit/test_conditions.py ....                                                                                                                                                    [ 29%]
tests/unit/test_formatter.py ...........                                                                                                                                              [ 33%]
tests/unit/test_guess_packager.py .........                                                                                                                                           [ 36%]
tests/unit/test_macro_definitions.py .....                                                                                                                                            [ 38%]
tests/unit/test_macros.py ....F                                                                                                                                                       [ 40%]
tests/unit/test_options.py .....................                                                                                                                                      [ 48%]
tests/unit/test_prep.py ................                                                                                                                                              [ 54%]
tests/unit/test_sections.py ............                                                                                                                                              [ 58%]
tests/unit/test_sourcelist.py ...                                                                                                                                                     [ 59%]
tests/unit/test_sources.py ...............................................                                                                                                            [ 76%]
tests/unit/test_spec_parser.py ....                                                                                                                                                   [ 78%]
tests/unit/test_specfile.py ..........                                                                                                                                                [ 81%]
tests/unit/test_tags.py ....                                                                                                                                                          [ 83%]
tests/unit/test_utils.py ............................                                                                                                                                 [ 93%]
tests/unit/test_value_parser.py .................                                                                                                                                     [100%]

========================================================================================= FAILURES ==========================================================================================
__________________________________________________________________________________ test_parse_texlive_spec __________________________________________________________________________________

    @pytest.mark.fail_slow(30)
    def test_parse_texlive_spec():
>       spec = Specfile("/tmp/texlive.spec")

tests/performance/test_parse.py:11:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
specfile/specfile.py:67: in __init__
    self._lines = self._read_lines(self._path)
specfile/specfile.py:106: in _read_lines
    return path.read_text(encoding="utf8", errors="surrogateescape").splitlines()
/usr/lib64/python3.8/pathlib.py:1236: in read_text
    with self.open(mode='r', encoding=encoding, errors=errors) as f:
/usr/lib64/python3.8/pathlib.py:1222: in open
    return io.open(self, mode, buffering, encoding, errors, newline,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = PosixPath('/tmp/texlive.spec'), name = '/tmp/texlive.spec', flags = 524288, mode = 438

    def _opener(self, name, flags, mode=0o666):
        # A stub for the opener argument to built-in open()
>       return self._accessor.open(self, flags, mode)
E       FileNotFoundError: [Errno 2] No such file or directory: '/tmp/texlive.spec'

/usr/lib64/python3.8/pathlib.py:1078: FileNotFoundError
____________________________________________________________________________________ test_macros_reinit _____________________________________________________________________________________

    def test_macros_reinit():
        Macros.reinit(MacroLevel.BUILTIN)
>       assert all(m.level == MacroLevel.BUILTIN for m in Macros.dump())
E       assert False
E        +  where False = all(<generator object test_macros_reinit.<locals>.<genexpr> at 0x7f15ca04b200>)

tests/unit/test_macros.py:112: AssertionError
===================================================================================== warnings summary ======================================================================================
tests/performance/test_parse.py:9
  /home/tkloczko/rpmbuild/BUILD/specfile-0.22.0/tests/performance/test_parse.py:9: PytestUnknownMarkWarning: Unknown pytest.mark.fail_slow - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
    @pytest.mark.fail_slow(30)

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
================================================================================== short test summary info ==================================================================================
FAILED tests/performance/test_parse.py::test_parse_texlive_spec - FileNotFoundError: [Errno 2] No such file or directory: '/tmp/texlive.spec'
FAILED tests/unit/test_macros.py::test_macros_reinit - assert False
========================================================================= 2 failed, 270 passed, 1 warning in 4.04s ==========================================================================

Here is the list of modules installed in the build env:

Package                    Version
-------------------------- -------
build                      0.10.0
distro                     1.8.0
exceptiongroup             1.1.3
flexmock                   0.11.3
gpg                        1.21.0
iniconfig                  2.0.0
installer                  0.7.0
libcomps                   0.1.19
packaging                  23.1
pkg                        0.0.0
pluggy                     1.3.0
pyproject_hooks            1.0.0
pytest                     7.4.0
python-dateutil            2.8.2
setuptools                 68.0.0
setuptools-scm             7.1.0
setuptools-scm-git-archive 1.4
six                        1.16.0
tomli                      2.0.1
typing_extensions          4.7.1
wheel                      0.41.1

Workaround

  • There is an existing workaround that can be used until this issue is fixed.

Participation

  • I am willing to submit a pull request for this issue. (Packit team is happy to help!)
@nforro
Member

nforro commented Sep 1, 2023

I'm packaging your module as an rpm package

May I ask where/for what OS? It's already being packaged by our team for Fedora/EPEL: https://src.fedoraproject.org/rpms/python-specfile

You can fix the first failure by running pytest only on tests/unit and tests/integration (tests/performance are supposed to be run only as part of our CI), see:
https://github.com/packit/specfile/blob/e49e5e3ff191480046b4c509feacbc0973907d11/fedora/python-specfile.spec#L65C47-L65C47
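
For example, something along these lines (same options as in your log, just limiting the collected paths; a sketch rather than the exact spec invocation):

pytest -ra -m "not network" tests/unit tests/integration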

As for the second failure, I can't really tell without reproducing it.

@kloczek
Author

kloczek commented Sep 1, 2023

So if only those directories should be scanned, it means that pytest testpaths is not defined as a default in, for example, pytest.ini 🤔
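
Something along these lines in the pytest configuration (for example in setup.cfg, where this project keeps its pytest settings) would make those two directories the default:

[tool:pytest]
testpaths = tests/unit tests/integration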

@kloczek
Author

kloczek commented Sep 1, 2023

@nforro
Member

nforro commented Sep 1, 2023

Ok, I can add that.

@majamassarini majamassarini added area/general Related to whole service, not a specific part/integration. kind/bug Something isn't working. complexity/single-task Regular task, should be done within days. gain/low This doesn't bring that much value to users. impact/low This issue impacts only a few users. labels Sep 4, 2023
softwarefactory-project-zuul bot added a commit that referenced this issue Sep 7, 2023
Remove deepdiff lock and set default testpaths

The deepdiff bug has been fixed: seperman/deepdiff@410019e
Testing dependencies only satisfy unit and integration tests, make them the default.
Partially fixes #283.

Reviewed-by: Matej Focko
@nforro nforro reopened this Sep 7, 2023
@nforro
Member

nforro commented Sep 7, 2023

@kloczek Could you provide more info so I can try to reproduce the second failure?

@kloczek
Author

kloczek commented Sep 7, 2023

@kloczek Could you provide more info so I can try to reproduce the second failure?

I think I've provided all those details (I've been using my ticket template for pytest failures) 😋
The module .whl archive is built using a PEP 517 based build procedure (using the build module, but you can use pip as well).
I've listed the modules with their versions installed in the build env; please tell me what exactly is still missing/not clear for you.

As the input source I'm using the source tree unpacked from the tarball generated from the git tag.

BTW I'll try to test the already merged PR ASAP (will be back with the result shortly).

@kloczek
Author

kloczek commented Sep 7, 2023

Just tested #284 and indeed one test unit is OK now:

+ PYTHONPATH=/home/tkloczko/rpmbuild/BUILDROOT/python-specfile-0.22.0-2.fc35.x86_64/usr/lib64/python3.8/site-packages:/home/tkloczko/rpmbuild/BUILDROOT/python-specfile-0.22.0-2.fc35.x86_64/usr/lib/python3.8/site-packages
+ /usr/bin/pytest -ra -m 'not network'
============================= test session starts ==============================
platform linux -- Python 3.8.18, pytest-7.4.1, pluggy-1.3.0
rootdir: /home/tkloczko/rpmbuild/BUILD/specfile-0.22.0
configfile: setup.cfg
testpaths: tests/unit, tests/integration
collected 271 items

tests/unit/test_changelog.py ....................................        [ 13%]
tests/unit/test_conditions.py ....                                       [ 14%]
tests/unit/test_formatter.py ...........                                 [ 18%]
tests/unit/test_guess_packager.py .........                              [ 22%]
tests/unit/test_macro_definitions.py .....                               [ 23%]
tests/unit/test_macros.py ....F                                          [ 25%]
tests/unit/test_options.py .....................                         [ 33%]
tests/unit/test_prep.py ................                                 [ 39%]
tests/unit/test_sections.py ............                                 [ 43%]
tests/unit/test_sourcelist.py ...                                        [ 45%]
tests/unit/test_sources.py ............................................. [ 61%]
..                                                                       [ 62%]
tests/unit/test_spec_parser.py ....                                      [ 63%]
tests/unit/test_specfile.py ..........                                   [ 67%]
tests/unit/test_tags.py ....                                             [ 69%]
tests/unit/test_utils.py ............................                    [ 79%]
tests/unit/test_value_parser.py .................                        [ 85%]
tests/integration/test_specfile.py ..................................... [ 99%]
..                                                                       [100%]

=================================== FAILURES ===================================
______________________________ test_macros_reinit ______________________________

    def test_macros_reinit():
        Macros.reinit(MacroLevel.BUILTIN)
>       assert all(m.level == MacroLevel.BUILTIN for m in Macros.dump())
E       assert False
E        +  where False = all(<generator object test_macros_reinit.<locals>.<genexpr> at 0x7f6d97271580>)

tests/unit/test_macros.py:112: AssertionError
=========================== short test summary info ============================
FAILED tests/unit/test_macros.py::test_macros_reinit - assert False
======================== 1 failed, 270 passed in 3.97s =========================

@nforro
Member

nforro commented Sep 7, 2023

please tell me what exactly is still missing/not clear for you

Particularly RPM installation details.

But I'm curious, did the test pass before (with versions older than 0.22.0)? Could you run this code and post the output?

from specfile.macros import Macros, MacroLevel

Macros.reinit(MacroLevel.BUILTIN)
print([m for m in Macros.dump() if m.level != MacroLevel.BUILTIN])

@kloczek
Author

kloczek commented Sep 7, 2023

Particularly RPM installation details.

Here is my spec file:

# BUG: test suite is failing https://github.com/packit/specfile/issues/283
%bcond_with     failing_tests   # By default skip some failing test units

Summary:        A library for parsing and manipulating RPM spec files
Name:           python-specfile
Version:        0.22.0
Release:        2%{?dist}
License:        MIT (https://spdx.org/licenses/MIT.html)
URL:            https://pypi.org/project/specfile/
VCS:            https://github.com/packit/specfile/
Source:         %{VCS}/archive/%{version}/%{name}-%{version}.tar.gz
Patch:          %{VCS}/pull/284.patch#/%{name}-Remove-deepdiff-lock-and-set-default-testpaths.patch
BuildArch:      noarch
BuildRequires:  python3dist(build)
BuildRequires:  python3dist(installer)
BuildRequires:  python3dist(setuptools-scm)
BuildRequires:  python3dist(setuptools-scm-git-archive)
BuildRequires:  python3dist(wheel)
# CheckRequires:
BuildRequires:  git-core
BuildRequires:  python3dist(flexmock)
BuildRequires:  python3dist(pytest)
Obsoletes:      python3-specfile

%description
Python library for parsing and manipulating RPM spec files. Main focus is on
modifying existing spec files, any change should result in a minimal diff.

%prep
%autosetup -p1 -n specfile-%{version}

%build
%pyproject_wheel

%install
%pyproject_install

%check
%pytest %{!?with_failing_tests: \
        --deselect tests/unit/test_macros.py::test_macros_reinit \
}

%files
%doc README.*
%{python3_sitelib}/specfile
%{python3_sitelib}/specfile-*.*-info

However, without some of the macros I'm using above, you probably won't be able to use it straight away.
For example, Fedora/RH uses pip to unpack the .whl into %{buildroot}; I'm using installer.
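
The installer step boils down to something like this (a sketch; the exact wheel name and dist/ location depend on the build):

# unpack the built wheel into the RPM build root instead of using pip
python3 -m installer --destdir=%{buildroot} dist/*.whl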
My %pytest macro looks much more sophisticated:

[tkloczko@pers-jacek SPECS]$ rpm -E %pytest
\

ASMFLAGS="-O2 -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fno-omit-frame-pointer -mno-omit-leaf-frame-pointer -m64 -mtune=generic -fasynchronous-unwind-tables -fstack-clash-protection -fcf-protection -fdata-sections -ffunction-sections -flto=auto -flto-partition=none";
CFLAGS="-O2 -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fno-omit-frame-pointer -mno-omit-leaf-frame-pointer -m64 -mtune=generic -fasynchronous-unwind-tables -fstack-clash-protection -fcf-protection -fdata-sections -ffunction-sections -flto=auto -flto-partition=none";
CXXFLAGS="-O2 -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fno-omit-frame-pointer -mno-omit-leaf-frame-pointer -m64 -mtune=generic -fasynchronous-unwind-tables -fstack-clash-protection -fcf-protection -fdata-sections -ffunction-sections -flto=auto -flto-partition=none";
FFLAGS="-O2 -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fno-omit-frame-pointer -mno-omit-leaf-frame-pointer -m64 -mtune=generic -fasynchronous-unwind-tables -fstack-clash-protection -fcf-protection -fdata-sections -ffunction-sections -flto=auto -flto-partition=none -I/usr/lib64/gfortran/modules";
FCFLAGS="-O2 -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fno-omit-frame-pointer -mno-omit-leaf-frame-pointer -m64 -mtune=generic -fasynchronous-unwind-tables -fstack-clash-protection -fcf-protection -fdata-sections -ffunction-sections -flto=auto -flto-partition=none -I/usr/lib64/gfortran/modules";
LDFLAGS="-Wl,--gc-sections -Wl,--as-needed -flto=auto -flto-partition=none -fuse-linker-plugin -Wl,-z,now -specs=/usr/lib/rpm/redhat/redhat-hardened-ld -Wl,--build-id=sha1";
RUSTFLAGS="-C codegen-units=1 -C debuginfo=2 -C opt-level=2 -C link-arg=-fdata-sections -C link-arg=-ffunction-sections -C link-arg=-Wl,--as-needed -C link-arg=-Wl,-z,now -C link-arg=-Wl,-z,relro --cap-lints=warn" ;
VALAFLAGS="-g" ;
CC="/usr/bin/gcc"; CXX="/usr/bin/g++"; FC="/usr/bin/gfortran";
AR="/usr/bin/gcc-ar"; NM="/usr/bin/gcc-nm"; RANLIB="/usr/bin/gcc-ranlib";
export ASMFLAGS CFLAGS CXXFLAGS FFLAGS FCFLAGS LDFLAGS VALAFLAGS CC CXX FC AR NM RANLIB RUSTFLAGS VALAFLAGS;
 \
        PATH=/home/tkloczko/rpmbuild/BUILDROOT/%{NAME}-%{VERSION}-%{RELEASE}.x86_64/usr/bin:$PATH \
        LD_LIBRARY_PATH=/home/tkloczko/rpmbuild/BUILDROOT/%{NAME}-%{VERSION}-%{RELEASE}.x86_64/usr/lib64 \
        PYTHONDONTWRITEBYTECODE=1 \
        PDM_BUILD_SCM_VERSION=%{version} \
        PBR_VERSION=%{version} \
        SETUPTOOLS_SCM_PRETEND_VERSION=%{version} \
        PYTHONPATH=${PYTHONPATH:-/home/tkloczko/rpmbuild/BUILDROOT/%{NAME}-%{VERSION}-%{RELEASE}.x86_64/usr/lib64/python3.8/site-packages:/home/tkloczko/rpmbuild/BUILDROOT/%{NAME}-%{VERSION}-%{RELEASE}.x86_64/usr/lib/python3.8/site-packages} \
         \
        /usr/bin/pytest -ra -m "not network"

Nevertheless, it should be possible to reproduce this failure without rpm.
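
For instance, from a plain source checkout with pytest and flexmock installed, something like this should trigger it (a minimal sketch, not the exact rpm %pytest invocation):

# run only the failing test unit, no network access needed
pytest -ra tests/unit/test_macros.py::test_macros_reinit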

But I'm curious, did the test pass before (with versions older than 0.22.0)? Could you run this code and post the output?

I've had those two test units on the --deselect list for the last few months and, sorry, I had no time to report that.
You may assume that at least in the previous version those two units have been failing as well.

@kloczek
Author

kloczek commented Mar 17, 2024

Just FTR: tested 0.28.0 and it looks like pytest fails like before.

@kloczek
Author

kloczek commented Mar 26, 2024

Tested 0.28.1 and it still fails.

@nforro
Member

nforro commented Jul 25, 2024

Sorry, I can't do anything about this without being able to reproduce it or at least having a dump of macros from the affected system.

Could you run this code and post the output?

from specfile.macros import Macros, MacroLevel

Macros.reinit(MacroLevel.BUILTIN)
print([m for m in Macros.dump() if m.level != MacroLevel.BUILTIN])
