You can use the Neptune integration with XGBoost to capture model-training metadata through NeptuneCallback.
You can find detailed information on how to install and use the integration in the user guide.
Neptune callback for logging metadata during XGBoost model training.
This callback logs metrics, all parameters, the learning rate, the pickled model, and visualizations. If early stopping is activated, "best_score" and "best_iteration" are also logged.
Metrics are logged for every dataset in the evals list and for every metric specified. For example, with evals = [(dtrain, "train"), (dval, "valid")] and "eval_metric": ["mae", "rmse"], 4 metrics are created: "train/mae", "train/rmse", "valid/mae", and "valid/rmse".
The callback works with the xgboost.train() and xgboost.cv() functions, and with the scikit-learn API.
Logging visualizations requires the graphviz library; see the user guide for installation instructions.
```python
# Create run
import neptune.new as neptune

run = neptune.init(project="WORKSPACE/PROJECT")

# Create Neptune callback and pass it to the xgb.train() function
from neptune.new.integrations.xgboost import NeptuneCallback

neptune_callback = NeptuneCallback(run=run)

xgb.train(
    ...,
    callbacks=[neptune_callback],
)

# When creating the callback, you can customize what you want to log and where
neptune_callback = NeptuneCallback(
    run=run,
    base_namespace="experiment",
    log_model=False,
    log_tree=[0, 1, 2, 3],
)
```
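As a rough illustration of what base_namespace does, the callback nests everything it logs under the given prefix. The helper below is a sketch of that idea, not the integration's actual implementation:

```python
# Illustrative only: base_namespace acts as a prefix for every logged field.
def field_path(base_namespace: str, field: str) -> str:
    """Build the full path a field would be logged under."""
    return f"{base_namespace}/{field}"

print(field_path("experiment", "train/mae"))   # experiment/train/mae
print(field_path("experiment", "best_score"))  # experiment/best_score
```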