Update Python doc. [skip ci] #5517

Merged · 3 commits · Apr 14, 2020
23 changes: 16 additions & 7 deletions python-package/xgboost/core.py
@@ -1048,6 +1048,9 @@ class DeviceQuantileDMatrix(DMatrix):
quantisation.

You can construct DeviceQuantileDMatrix from cupy/cudf/dlpack.

.. versionadded:: 1.1.0

"""

def __init__(self, data, label=None, weight=None, base_margin=None,
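
For reference, a minimal usage sketch of the class documented above; the array shapes, parameter values, and training call are illustrative assumptions rather than part of this patch:

    import cupy as cp
    import xgboost as xgb

    # DeviceQuantileDMatrix consumes data that already lives on the GPU
    # (cupy/cudf/dlpack) and is intended for training with 'gpu_hist'.
    X = cp.random.rand(1000, 10)
    y = cp.random.rand(1000)
    dtrain = xgb.DeviceQuantileDMatrix(X, label=y)
    booster = xgb.train({'tree_method': 'gpu_hist'}, dtrain, num_boost_round=10)
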
@@ -1197,7 +1200,10 @@ def __setstate__(self, state):

def save_config(self):
'''Output internal parameter configuration of Booster as a JSON
string.'''
string.

.. versionadded:: 1.0.0
'''
json_string = ctypes.c_char_p()
length = c_bst_ulong()
_check_call(_LIB.XGBoosterSaveJsonConfig(
@@ -1208,7 +1214,10 @@ def save_config(self):
return json_string

def load_config(self, config):
'''Load configuration returned by `save_config`.'''
'''Load configuration returned by `save_config`.

.. versionadded:: 1.0.0
'''
assert isinstance(config, str)
_check_call(_LIB.XGBoosterLoadJsonConfig(
self.handle,
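
A small sketch of the round trip these two methods enable, assuming `booster` is an existing `xgboost.Booster`; the dict handling is illustrative and no particular key layout of the JSON is assumed:

    import json

    config_str = booster.save_config()        # JSON string of internal parameters
    config = json.loads(config_str)           # inspect or tweak it as a Python dict
    booster.load_config(json.dumps(config))   # feed the (possibly edited) JSON back
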
@@ -1218,11 +1227,7 @@ def __copy__(self):
return self.__deepcopy__(None)

def __deepcopy__(self, _):
'''Return a copy of booster. Caches for DMatrix are not copied so continue
training on copied booster will result in lower performance and
slightly different result.

'''
'''Return a copy of booster.'''
return Booster(model_file=self)

def copy(self):
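
With the caveat about DMatrix caches dropped from the docstring, both copy paths simply produce an independent Booster; a quick illustration, again assuming `booster` is a trained `xgboost.Booster`:

    import copy

    booster_a = booster.copy()           # explicit copy
    booster_b = copy.deepcopy(booster)   # goes through __deepcopy__, same effect
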
@@ -1517,6 +1522,8 @@ def predict(self,
Whether the prediction value is used for training. This can affect the
`dart` booster, which performs dropouts during training iterations.

.. versionadded:: 1.0.0

.. note:: Using ``predict()`` with DART booster

If the booster object is DART type, ``predict()`` will not perform
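
A hedged sketch of how the ``training`` flag documented in the hunk above is meant to be used; the names `dtrain` and `dtest` are placeholder DMatrix objects:

    # For a 'dart' booster, dropout only applies when training=True;
    # plain inference should keep the default training=False.
    preds = booster.predict(dtest)
    preds_for_training = booster.predict(dtrain, training=True)
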
@@ -1601,6 +1608,8 @@ def inplace_predict(self, data, iteration_range=(0, 0),
booster.set_param({'predictor': 'cpu_predictor'})
booster.inplace_predict(numpy_array)

.. versionadded:: 1.1.0

Parameters
----------
data : numpy.ndarray/scipy.sparse.csr_matrix/cupy.ndarray/
8 changes: 8 additions & 0 deletions python-package/xgboost/dask.py
@@ -126,6 +126,8 @@ class DaskDMatrix:
the input data explicitly if you want to see actual computation of
constructing `DaskDMatrix`.

.. versionadded:: 1.0.0

Parameters
----------
client: dask.distributed.Client
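
For context, a minimal sketch of constructing a `DaskDMatrix`; the cluster setup, chunk sizes, and random data are illustrative assumptions:

    from dask.distributed import Client
    import dask.array as da
    import xgboost as xgb

    client = Client()                     # connect to (or start) a Dask cluster
    X = da.random.random((1000, 10), chunks=(100, 10))
    y = da.random.random(1000, chunks=100)
    # Construction is lazy; the data stays distributed on the workers.
    dtrain = xgb.dask.DaskDMatrix(client, X, y)
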
@@ -368,6 +370,8 @@ def _get_rabit_args(worker_map, client):
def train(client, params, dtrain, *args, evals=(), **kwargs):
'''Train XGBoost model.

.. versionadded:: 1.0.0

Parameters
----------
client: dask.distributed.Client
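
A sketch of calling the distributed ``train``, continuing from the `client` and `dtrain` names assumed in the previous sketch:

    output = xgb.dask.train(client, {'tree_method': 'hist'}, dtrain,
                            num_boost_round=10)
    booster = output['booster']   # plain xgboost.Booster
    history = output['history']   # per-dataset evaluation history
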
@@ -459,6 +463,8 @@ def predict(client, model, data, *args, missing=numpy.nan):

Only the default prediction mode is supported right now.

.. versionadded:: 1.0.0

Parameters
----------
client: dask.distributed.Client
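
A sketch of the distributed ``predict``, reusing the assumed `client`, `X`, and training `output` from the earlier sketches; the model argument may be the returned dict or the bare Booster:

    dtest = xgb.dask.DaskDMatrix(client, X)
    predictions = xgb.dask.predict(client, output, dtest)  # a lazy dask collection
    local_predictions = predictions.compute()               # materialise locally
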
@@ -582,6 +588,8 @@ def inplace_predict(client, model, data,
missing=numpy.nan):
'''Inplace prediction.

.. versionadded:: 1.1.0

Parameters
----------
client: dask.distributed.Client
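
And a sketch of the new distributed ``inplace_predict``, which skips the `DaskDMatrix` step and predicts directly on the dask collection (same assumed names as above):

    preds = xgb.dask.inplace_predict(client, output, X)  # X is the raw dask array
    local_preds = preds.compute()
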