
Commit c6f5604

[doc] Move model slicing document for R. (dmlc#11316)
1 parent 8fb2468 commit c6f5604

File tree

3 files changed (+27, -5 lines)


doc/python/index.rst (-1 line)

@@ -13,7 +13,6 @@ Contents
    sklearn_estimator
    python_api
    callbacks
-   model
    examples/index
    dask-examples/index
    survival-examples/index

doc/tutorials/index.rst (+1 line)

@@ -11,6 +11,7 @@ See `Awesome XGBoost <https://github.com/dmlc/xgboost/tree/master/demo>`_ for mo

    model
    saving_model
+   slicing_model
    learning_to_rank
    dart
    monotonic

doc/python/model.rst renamed to doc/tutorials/slicing_model.rst (+26, -4 lines)

@@ -1,13 +1,15 @@
-#####
-Model
-#####
+##############
+Slicing Models
+##############

 Slice tree model
 ----------------

 When ``booster`` is set to ``gbtree`` or ``dart``, XGBoost builds a tree model, which is a
 list of trees and can be sliced into multiple sub-models.

+In Python:
+
 .. code-block:: python

     from sklearn.datasets import make_classification
@@ -32,6 +34,24 @@ list of trees and can be sliced into multiple sub-models.
     trees = [_ for _ in booster]
     assert len(trees) == num_boost_round

+In R:
+
+.. versionadded:: 3.0.0
+
+.. code-block:: R
+
+    data(agaricus.train, package = "xgboost")
+    dm <- xgb.DMatrix(agaricus.train$data, label = agaricus.train$label)
+
+    model <- xgb.train(
+      params = xgb.params(objective = "binary:logistic", max_depth = 4),
+      data = dm,
+      nrounds = 20
+    )
+    sliced <- model[seq(3, 7)]
+    ##### xgb.Booster
+    # of features: 126
+    # of rounds: 5

 The sliced model is a copy of selected trees, that means the model itself is immutable
 during slicing. This feature is the basis of `save_best` option in early stopping
@@ -40,4 +60,6 @@ how to combine prediction with sliced trees.

 .. note::

-   The returned model slice doesn't contain attributes like :py:class:`~xgboost.Booster.best_iteration` and :py:class:`~xgboost.Booster.best_score`.
+   The returned model slice doesn't contain attributes like
+   :py:class:`~xgboost.Booster.best_iteration` and
+   :py:class:`~xgboost.Booster.best_score`.
