Package dependency troubles with python 3.11
#815
Thanks for posting here!
Hey @aaronspring, I think I'm running into the same issues with the conda package for `climpred`. What's interesting, though, is that older versions of `climpred` install fine. I can see in the recipe that the more recent versions added a few new dependencies.
My workaround for the time being is to install everything I need for `climpred` myself. Would it make sense to try publishing a new release/build to see what the problem might be?
There is a new conda release, see conda-forge/climpred-feedstock#23. Any more suggestions on what to change are welcome.
With my newfound maintainer powers, I just released an updated `climpred` build. @gmacgilchrist, can you verify that the latest build works for you? Let us know here!
See conda-forge/climpred-feedstock#26 and #832, hopefully resolving this.
Is this now closed by conda-forge/climpred-feedstock#27? @gmacgilchrist
Sorry for the delay in checking this. This works with the new build. Thank you.
It seems as though dependencies of `climpred` version `2.3.0` are incompatible with the latest version of `python` (`3.11`). This appears to be rooted in the fact that `numba` requires `python<=3.10`. This can cause problems when installing `climpred` in an environment that has pinned `python=3.11`, or when installing directly from an `environment.yml` file in which no versions are specified and `python=3.11` is picked up by default. In those cases, `climpred` reverts to version `1.1.0`!

The only approach that I found that worked was to set up a completely clean environment, with a call to install only `climpred`.

Some things that failed that might help pin down the dependency issues:

An install with `python=3.9` installs an old version of `numba` (`0.53.1`), as well as other older dependencies. This is likewise true for an install that attempts to force `climpred=2.3.0`. Importing `climpred` in these environments fails with the following traceback, which again points to difficulties with `numba`:
```
File /nbhome/gam/miniconda3/envs/predict_climpred23/lib/python3.9/site-packages/climpred/__init__.py:4
1 # flake8: noqa
2 from pkg_resources import DistributionNotFound, get_distribution
----> 4 from . import (
5 bias_removal,
6 bootstrap,
7 comparisons,
8 constants,
9 exceptions,
10 graphics,
11 horizon,
12 metrics,
13 prediction,
14 relative_entropy,
15 smoothing,
16 stats,
17 testing,
18 tutorial,
19 )
20 from .classes import HindcastEnsemble, PerfectModelEnsemble
21 from .options import set_options
File /nbhome/gam/miniconda3/envs/predict_climpred23/lib/python3.9/site-packages/climpred/bias_removal.py:9
6 import xarray as xr
8 from .constants import GROUPBY_SEASONALITIES
----> 9 from .metrics import Metric
10 from .options import OPTIONS
11 from .utils import (
12 convert_cftime_to_datetime_coords,
13 convert_time_index,
14 get_lead_cftime_shift_args,
15 shift_cftime_singular,
16 )
File /nbhome/gam/miniconda3/envs/predict_climpred23/lib/python3.9/site-packages/climpred/metrics.py:9
7 import xarray as xr
8 from scipy.stats import norm
----> 9 from xskillscore import (
10 Contingency,
11 brier_score,
12 crps_ensemble,
13 crps_gaussian,
14 crps_quadrature,
15 discrimination,
16 effective_sample_size,
17 mae,
18 mape,
19 median_absolute_error,
20 mse,
21 pearson_r,
22 pearson_r_eff_p_value,
23 pearson_r_p_value,
24 rank_histogram,
25 reliability,
26 rmse,
27 roc,
28 rps,
29 smape,
30 spearman_r,
31 spearman_r_eff_p_value,
32 spearman_r_p_value,
33 threshold_brier_score,
34 )
36 from .constants import CLIMPRED_DIMS
38 dimType = Optional[Union[str, List[str]]]
File /nbhome/gam/miniconda3/envs/predict_climpred23/lib/python3.9/site-packages/xskillscore/__init__.py:5
2 from pkg_resources import DistributionNotFound, get_distribution
4 from .core import resampling
----> 5 from .core.accessor import XSkillScoreAccessor
6 from .core.comparative import halfwidth_ci_test, sign_test
7 from .core.contingency import Contingency
File /nbhome/gam/miniconda3/envs/predict_climpred23/lib/python3.9/site-packages/xskillscore/core/accessor.py:3
1 import xarray as xr
----> 3 from .deterministic import (
4 effective_sample_size,
5 linslope,
6 mae,
7 mape,
8 me,
9 median_absolute_error,
10 mse,
11 pearson_r,
12 pearson_r_eff_p_value,
13 pearson_r_p_value,
14 r2,
15 rmse,
16 smape,
17 spearman_r,
18 spearman_r_eff_p_value,
19 spearman_r_p_value,
20 )
21 from .probabilistic import (
22 brier_score,
23 crps_ensemble,
(...)
31 threshold_brier_score,
32 )
35 @xr.register_dataset_accessor("xs")
36 class XSkillScoreAccessor(object):
File /nbhome/gam/miniconda3/envs/predict_climpred23/lib/python3.9/site-packages/xskillscore/core/deterministic.py:5
1 import warnings
3 import xarray as xr
----> 5 from .np_deterministic import (
6 _effective_sample_size,
7 _linslope,
8 _mae,
9 _mape,
10 _me,
11 _median_absolute_error,
12 _mse,
13 _pearson_r,
14 _pearson_r_eff_p_value,
15 _pearson_r_p_value,
16 _r2,
17 _rmse,
18 _smape,
19 _spearman_r,
20 _spearman_r_eff_p_value,
21 _spearman_r_p_value,
22 )
23 from .utils import (
24 _fail_if_dim_empty,
25 _preprocess_dims,
26 _preprocess_weights,
27 _stack_input_if_needed,
28 )
30 __all__ = [
31 "effective_sample_size",
32 "linslope",
(...)
46 "spearman_r_p_value",
47 ]
File /nbhome/gam/miniconda3/envs/predict_climpred23/lib/python3.9/site-packages/xskillscore/core/np_deterministic.py:6
3 from scipy import special
4 from scipy.stats import distributions
----> 6 from .utils import suppress_warnings
8 __all__ = [
9 "_effective_sample_size",
10 "_linslope",
(...)
24 "_spearman_r_p_value",
25 ]
28 def _match_nans(a, b, weights):
File /nbhome/gam/miniconda3/envs/predict_climpred23/lib/python3.9/site-packages/xskillscore/core/utils.py:6
4 import numpy as np
5 import xarray as xr
----> 6 from xhistogram.xarray import histogram as xhist
8 __all__ = ["histogram"]
11 @contextlib.contextmanager
12 def suppress_warnings(msg=None):
File /nbhome/gam/miniconda3/envs/predict_climpred23/lib/python3.9/site-packages/xhistogram/xarray.py:7
5 import xarray as xr
6 from collections import OrderedDict
----> 7 from .core import histogram as _histogram
9 # range is a keyword so save the builtin so they can use it.
10 _range = range
File /nbhome/gam/miniconda3/envs/predict_climpred23/lib/python3.9/site-packages/xhistogram/core.py:23
20 _range = range
22 try:
---> 23 import dask.array as dsa
25 has_dask = True
26 except ImportError:
File /nbhome/gam/miniconda3/envs/predict_climpred23/lib/python3.9/site-packages/dask/array/__init__.py:2
1 try:
----> 2 from dask.array import backends, fft, lib, linalg, ma, overlap, random
3 from dask.array.blockwise import atop, blockwise
4 from dask.array.chunk_types import register_chunk_type
File /nbhome/gam/miniconda3/envs/predict_climpred23/lib/python3.9/site-packages/dask/array/backends.py:6
3 import numpy as np
5 from dask.array import chunk
----> 6 from dask.array.core import Array
7 from dask.array.dispatch import (
8 concatenate_lookup,
9 divide_lookup,
(...)
17 to_numpy_dispatch,
18 )
19 from dask.array.numpy_compat import divide as np_divide
File /nbhome/gam/miniconda3/envs/predict_climpred23/lib/python3.9/site-packages/dask/array/core.py:36
34 from dask.array import chunk
35 from dask.array.chunk import getitem
---> 36 from dask.array.chunk_types import is_valid_array_chunk, is_valid_chunk_type
38 # Keep einsum_lookup and tensordot_lookup here for backwards compatibility
39 from dask.array.dispatch import ( # noqa: F401
40 concatenate_lookup,
41 einsum_lookup,
42 tensordot_lookup,
43 )
File /nbhome/gam/miniconda3/envs/predict_climpred23/lib/python3.9/site-packages/dask/array/chunk_types.py:122
119 pass
121 try:
--> 122 import sparse
124 register_chunk_type(sparse.SparseArray)
125 except ImportError:
File /nbhome/gam/miniconda3/envs/predict_climpred23/lib/python3.9/site-packages/sparse/__init__.py:1
----> 1 from ._coo import COO, as_coo
2 from ._compressed import GCXS
3 from ._dok import DOK
File /nbhome/gam/miniconda3/envs/predict_climpred23/lib/python3.9/site-packages/sparse/_coo/__init__.py:1
----> 1 from .core import COO, as_coo
2 from .common import (
3 concatenate,
4 clip,
(...)
22 diagonalize,
23 )
25 __all__ = [
26 "COO",
27 "as_coo",
(...)
47 "diagonalize",
48 ]
File /nbhome/gam/miniconda3/envs/predict_climpred23/lib/python3.9/site-packages/sparse/_coo/core.py:9
6 import warnings
8 import numpy as np
----> 9 import numba
10 import scipy.sparse
11 from numpy.lib.mixins import NDArrayOperatorsMixin
File /nbhome/gam/miniconda3/envs/predict_climpred23/lib/python3.9/site-packages/numba/__init__.py:43
39 from numba.core.decorators import (cfunc, generated_jit, jit, njit, stencil,
40 jit_module)
42 # Re-export vectorize decorators and the thread layer querying function
---> 43 from numba.np.ufunc import (vectorize, guvectorize, threading_layer,
44 get_num_threads, set_num_threads)
46 # Re-export Numpy helpers
47 from numba.np.numpy_support import carray, farray, from_dtype
File /nbhome/gam/miniconda3/envs/predict_climpred23/lib/python3.9/site-packages/numba/np/ufunc/__init__.py:3
1 # -*- coding: utf-8 -*-
----> 3 from numba.np.ufunc.decorators import Vectorize, GUVectorize, vectorize, guvectorize
4 from numba.np.ufunc._internal import PyUFunc_None, PyUFunc_Zero, PyUFunc_One
5 from numba.np.ufunc import _internal, array_exprs
File /nbhome/gam/miniconda3/envs/predict_climpred23/lib/python3.9/site-packages/numba/np/ufunc/decorators.py:3
1 import inspect
----> 3 from numba.np.ufunc import _internal
4 from numba.np.ufunc.parallel import ParallelUFuncBuilder, ParallelGUFuncBuilder
6 from numba.core.registry import TargetRegistry
SystemError: initialization of _internal failed without raising an exception
```
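The clean-environment workaround described above might look like the following sketch (the environment name is an assumption, not the exact commands used; requires a working conda with the conda-forge channel available):

```shell
# Create a fresh environment and install only climpred, letting the solver
# resolve all pins without interference from pre-existing constraints.
conda create -n climpred-clean -c conda-forge climpred

# Activate it and confirm which climpred version the solver actually picked.
conda activate climpred-clean
python -c "import climpred; print(climpred.__version__)"
```

Installing into a brand-new environment matters here: in an existing environment, already-pinned packages (e.g. `python=3.11`) can silently force the solver onto the ancient `climpred=1.1.0` build instead of failing loudly.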
It is possible that an `environment.yml` file that specifies `python=3.10` would work. I didn't have time to try.

I think I have now sorted this problem with a clean install, but I am posting here for reference. This is obviously an upstream issue, relating to `numpy` dependencies. However, it may be worthwhile to add some further information to the installation instructions highlighting these potential problems.

Many thanks for everyone's work on this package - it's a tremendous resource.
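For reference, an `environment.yml` along the lines suggested above might look like this (untested; the environment name is arbitrary, and the pins reflect the constraints discussed in this thread rather than a verified working set):

```yaml
name: predict
channels:
  - conda-forge
dependencies:
  # Pin python below 3.11, since numba did not yet support 3.11.
  - python=3.10
  # Pin climpred explicitly so the solver cannot fall back to 1.1.0.
  - climpred=2.3.0
```

Pinning both `python` and `climpred` makes the solver fail with an explicit conflict message if the combination is unsatisfiable, rather than silently downgrading `climpred`.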