
[NO MRG] Testing CI #1444

Closed

Conversation

jakirkham
Member

Description

An empty commit to test CI

Checklist

  • I am familiar with the Contributing Guidelines.
  • New or existing tests cover these changes.
  • The documentation is up to date with these changes.

@jakirkham jakirkham added bug Something isn't working non-breaking Non-breaking change labels Aug 24, 2024
@jakirkham
Member Author

Seeing the following error in the wheel builds on CI. This is taken from one CI job; the others look similar:

  CMake Error at /tmp/pip-build-env-p2tmag1j/normal/lib/python3.10/site-packages/libcudf/lib64/cmake/cudf/cudf-targets.cmake:61 (set_target_properties):
    The link interface of target "cudf::cudf" contains:

      Arrow::Arrow

    but the target was not found.  Possible reasons include:

      * There is a typo in the target name.
      * A find_package call is missing for an IMPORTED target.
      * An ALIAS target is missing.

  Call Stack (most recent call first):
    /tmp/pip-build-env-p2tmag1j/normal/lib/python3.10/site-packages/libcudf/lib64/cmake/cudf/cudf-config.cmake:91 (include)
    build/cp310-cp310-linux_aarch64/cmake/CPM_0.40.0.cmake:249 (find_package)
    build/cp310-cp310-linux_aarch64/cmake/CPM_0.40.0.cmake:303 (cpm_find_package)
    build/cp310-cp310-linux_aarch64/_deps/rapids-cmake-src/rapids-cmake/cpm/find.cmake:189 (CPMFindPackage)
    /__w/cuspatial/cuspatial/cpp/cmake/thirdparty/get_cudf.cmake:39 (rapids_cpm_find)
    /__w/cuspatial/cuspatial/cpp/cmake/thirdparty/get_cudf.cmake:72 (find_and_configure_cudf)
    /__w/cuspatial/cuspatial/cpp/CMakeLists.txt:107 (include)


  -- Generating done (0.0s)
  CMake Warning:
    Manually-specified variables were not used by the project:

      USE_LIBARROW_FROM_PYARROW


  CMake Generate step failed.  Build files cannot be regenerated correctly.

  *** CMake configuration failed
  error: subprocess-exited-with-error
  
  × Building wheel for cuspatial-cu12 (pyproject.toml) did not run successfully.
  │ exit code: 1
  ╰─> See above for output.
  
  note: This error originates from a subprocess, and is likely not a problem with pip.
  full command: /pyenv/versions/3.10.14/bin/python /pyenv/versions/3.10.14/lib/python3.10/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py build_wheel /tmp/tmpev9sqaup
  cwd: /__w/cuspatial/cuspatial/python/cuspatial
  Building wheel for cuspatial-cu12 (pyproject.toml): finished with status 'error'
  ERROR: Failed building wheel for cuspatial-cu12
Failed to build cuspatial-cu12
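
For context, this class of CMake error means that cudf's exported targets file references an IMPORTED target (`Arrow::Arrow`) that was never created in the consuming build. The generic remediation is to make sure the target exists before the package's targets are imported, roughly like this (a hypothetical sketch, not the fix ultimately applied here; the actual resolution was `cudf` dropping its Arrow link dependency in rapidsai/cudf#16640):

```cmake
# Create Arrow's IMPORTED targets (including Arrow::Arrow) first,
# so that cudf-targets.cmake can reference them.
find_package(Arrow REQUIRED)

# Now cudf's exported link interface resolves cleanly.
find_package(cudf REQUIRED)

target_link_libraries(cuspatial PRIVATE cudf::cudf)
```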

@jameslamb
Member

I just restarted all CI here. I'd like to see if rapidsai/cudf#16640 being merged changes the failures we see here.

(I'm testing that locally too)

rapids-bot bot pushed a commit that referenced this pull request Aug 29, 2024
Contributes to rapidsai/build-planning#33.

Proposes the following for `cuspatial` wheels:

* add build and runtime dependencies on `libcudf` wheels
* stop vendoring copies of `libcudf.so`, `libnvcomp.so`, `libnvcomp_bitcomp.so`, and `libnvcomp_gdeflate.so`
  - *(load `libcudf.so` dynamically at runtime instead)*
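
As a rough sketch of what "load `libcudf.so` dynamically at runtime" can look like (a hypothetical illustration with made-up helper names, not the actual loader shipped in the RAPIDS wheels):

```python
import ctypes
import os

def load_shared_library(name, search_dirs=()):
    """Try to load a shared library from candidate directories first,
    then fall back to the system loader. Returns the CDLL handle,
    or None if the library cannot be found anywhere."""
    for directory in search_dirs:
        candidate = os.path.join(directory, name)
        if os.path.exists(candidate):
            # RTLD_GLOBAL makes the library's symbols visible to
            # extension modules that are loaded afterwards.
            return ctypes.CDLL(candidate, mode=ctypes.RTLD_GLOBAL)
    try:
        return ctypes.CDLL(name, mode=ctypes.RTLD_GLOBAL)
    except OSError:
        return None

# e.g. a package's __init__.py could call this before importing its
# compiled extension modules:
#   load_shared_library("libcudf.so", search_dirs=[...])
```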

And other related changes for development/CI:

* combine all `pip install` calls into 1 in wheel-testing scripts
  - *(like rapidsai/cudf#16575, to improve the chance that packaging issues are discovered in CI)*
* `dependencies.yaml` changes:
   - more use of YAML anchors = less duplication
   - use dedicated `depends_on_librmm` and `depends_on_libcudf` groups
* explicitly pass a package type to `gha-tools` wheel uploading/downloading scripts
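
The YAML-anchor pattern mentioned above looks roughly like this (hypothetical keys and versions, not the actual `dependencies.yaml` contents):

```yaml
# define a dependency list once, under an anchor...
depends_on_libcudf: &depends_on_libcudf
  - libcudf-cu12==24.10.*

# ...then reuse it via aliases instead of copy-pasting
build_requirements: *depends_on_libcudf
test_requirements: *depends_on_libcudf
```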

## Notes for Reviewers

### Benefits of these changes

Unblocks CI in this repo (ref: #1444 (comment), #1441 (comment)).

Reduces wheel sizes for `cuspatial` wheels by about 125MB 😁 

| wheel       | size (before) | size (this PR) |
|:-----------:|--------------:|---------------:|
| `cuspatial` |        146.0M |            21M |
| `cuproj`    |          0.9M |           0.9M |
| **TOTAL**   |    **146.9M** |      **21.9M** |

*NOTES: size = compressed, "before" = 2024-08-21 nightlies (c60bd4d), CUDA = 12, Python = 3.11*

<details><summary>how I calculated those (click me)</summary>

```shell
# note: 2024-08-21 because that was the most recent date with
# successfully-built cuspatial nightlies
#
docker run \
    --rm \
    -v $(pwd):/opt/work:ro \
    -w /opt/work \
    --network host \
    --env RAPIDS_NIGHTLY_DATE=2024-08-21 \
    --env RAPIDS_NIGHTLY_SHA=c60bd4d \
    --env RAPIDS_PR_NUMBER=1447 \
    --env RAPIDS_PY_CUDA_SUFFIX=cu12 \
    --env RAPIDS_REPOSITORY=rapidsai/cuspatial \
    --env WHEEL_DIR_BEFORE=/tmp/wheels-before \
    --env WHEEL_DIR_AFTER=/tmp/wheels-after \
    -it rapidsai/ci-wheel:cuda12.5.1-rockylinux8-py3.11 \
    bash

mkdir -p "${WHEEL_DIR_BEFORE}"
mkdir -p "${WHEEL_DIR_AFTER}"

py_projects=(
    cuspatial
    cuproj
)

for project in "${py_projects[@]}"; do
    # before
    RAPIDS_BUILD_TYPE=nightly \
    RAPIDS_PY_WHEEL_NAME="${project}_${RAPIDS_PY_CUDA_SUFFIX}" \
    RAPIDS_REF_NAME="branch-24.10" \
    RAPIDS_SHA=${RAPIDS_NIGHTLY_SHA} \
        rapids-download-wheels-from-s3 python "${WHEEL_DIR_BEFORE}"

    # after
    RAPIDS_BUILD_TYPE=pull-request \
    RAPIDS_PY_WHEEL_NAME="${project}_${RAPIDS_PY_CUDA_SUFFIX}" \
    RAPIDS_REF_NAME="pull-request/${RAPIDS_PR_NUMBER}" \
        rapids-download-wheels-from-s3 python "${WHEEL_DIR_AFTER}"
done

du -sh ${WHEEL_DIR_BEFORE}/*
du -sh ${WHEEL_DIR_BEFORE}
du -sh ${WHEEL_DIR_AFTER}/*
du -sh ${WHEEL_DIR_AFTER}
```

</details>

Reduces the amount of additional work required to start shipping `libcuspatial` wheels.

### Background

This is part of ongoing work towards packaging `libcuspatial` as a wheel.

relevant prior work:

* packaging `libcudf` wheels: rapidsai/cudf#15483
* consolidating `pip install` calls in CI scripts for `cudf`: rapidsai/cudf#16575
* `cudf` dropping its Arrow library dependency: rapidsai/cudf#16640

### How I tested this

Confirmed in local builds and CI logs that `cudf` is being *found*, not *built*, in `cuspatial` builds.

```text
-- CPM: Using local package cudf@<version>
```

([build link](https://github.com/rapidsai/cuspatial/actions/runs/10602971716/job/29386288614?pr=1447#step:9:23472))

Built `cuspatial` wheels locally and ran all the unit tests, without issue.


Authors:
  - James Lamb (https://github.com/jameslamb)

Approvers:
  - Bradley Dice (https://github.com/bdice)
  - Vyas Ramasubramani (https://github.com/vyasr)
  - Matthew Roeschke (https://github.com/mroeschke)

URL: #1447
@jakirkham
Member Author

Whenever you are happy with things here, please feel free to close.

@jameslamb
Member

Yep, I think things are good now, thanks!

@jameslamb jameslamb closed this Aug 30, 2024
@jakirkham jakirkham deleted the tst_ci branch August 30, 2024 19:01