neptunecontrib.monitoring.fastai¶
Module Contents¶
Classes¶
NeptuneMonitor | Logs metrics from the fastai learner to Neptune.
class neptunecontrib.monitoring.fastai.LearnerCallback¶
class neptunecontrib.monitoring.fastai.NeptuneMonitor(learn=None, experiment=None, prefix='')¶

Bases: fastai.basic_train.LearnerCallback
Logs metrics from the fastai learner to Neptune.
Goes over smooth_loss and last_metrics after each batch and epoch and logs them to the appropriate Neptune channels.
See an example experiment here: https://ui.neptune.ai/neptune-ai/neptune-examples/e/NEP-493/charts.
Parameters

experiment (neptune.experiments.Experiment) – Neptune experiment.
prefix (str) – Prefix that is added before the metric_name and valid_name before logging to the appropriate channel. Default is ''.
Examples
Prepare data:
from fastai.vision import *

path = untar_data(URLs.MNIST_TINY)
data = ImageDataBunch.from_folder(path, ds_tfms=(rand_pad(2, 28), []), bs=64)
data.normalize(imagenet_stats)

learn = cnn_learner(data, models.resnet18, metrics=accuracy)
learn.lr_find()
learn.recorder.plot()
Now create a Neptune experiment and pass NeptuneMonitor to the learner's callback functions:
import neptune
from neptunecontrib.monitoring.fastai import NeptuneMonitor

neptune.init(project_qualified_name='USER_NAME/PROJECT_NAME')

with neptune.create_experiment():
    learn = create_cnn(data, models.resnet18,
                       metrics=accuracy,
                       callback_fns=[NeptuneMonitor])
    learn.fit_one_cycle(20, 1e-2)
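If you want a custom channel prefix, or want to log to an experiment object you manage yourself, the callback class can also be bound with functools.partial before handing it to fastai. The following is a minimal sketch, continuing from the data prepared above; the 'train_' prefix and the exp variable name are illustrative choices, not part of the library:

from functools import partial

import neptune
from neptunecontrib.monitoring.fastai import NeptuneMonitor

neptune.init(project_qualified_name='USER_NAME/PROJECT_NAME')
exp = neptune.create_experiment()

# Bind the experiment and a channel prefix; fastai passes the learner itself
# as the first argument when it instantiates the callback.
monitor = partial(NeptuneMonitor, experiment=exp, prefix='train_')

learn = cnn_learner(data, models.resnet18, metrics=accuracy, callback_fns=[monitor])
learn.fit_one_cycle(20, 1e-2)

exp.stop()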
Note

You need to have the fastai library installed to use this module.
on_epoch_end(self, **kwargs)¶

on_batch_end(self, last_loss, iteration, train, **kwargs)¶
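For orientation, the behaviour described above (smooth_loss logged per batch, last_metrics logged per epoch) can be pictured with a simplified sketch. This is not the neptunecontrib implementation; the channel names and the metric-name lookup below are assumptions made for illustration only:

import neptune
from fastai.basic_train import LearnerCallback


class SimplifiedNeptuneMonitor(LearnerCallback):
    """Illustrative sketch only -- channel names are hypothetical."""

    def __init__(self, learn=None, experiment=None, prefix=''):
        if learn is not None:
            super().__init__(learn)
        # Fall back to the module-level neptune functions if no experiment is given.
        self._exp = experiment if experiment is not None else neptune
        self._prefix = prefix

    def on_batch_end(self, last_loss, iteration, train, **kwargs):
        # Log the smoothed training loss after every training batch.
        if train:
            self._exp.send_metric(self._prefix + 'last_loss', float(last_loss))

    def on_epoch_end(self, last_metrics, **kwargs):
        # last_metrics holds the validation loss followed by the learner's metrics.
        names = ['valid_last_loss'] + [m.__name__ for m in self.learn.metrics]
        for name, value in zip(names, last_metrics):
            self._exp.send_metric(self._prefix + name, float(value))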