Home
Neptune is an experiment tracker. It enables researchers to monitor their model training, visualize and compare model metadata, and collaborate on AI/ML projects within a team.
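In practice, tracking an experiment takes only a few lines of code. Below is a minimal sketch using Neptune's Python client; the project name, API token placeholder, hyperparameters, and metric values are illustrative stand-ins for your own.

```python
import neptune

# Start a tracked run; project and api_token are placeholders for your own.
run = neptune.init_run(
    project="my-workspace/my-project",  # illustrative project name
    api_token="YOUR_API_TOKEN",
)

# Log hyperparameters as a dictionary.
run["parameters"] = {"lr": 0.001, "batch_size": 64}

# Append metric values during training; each call adds a point to the series.
for epoch in range(10):
    loss = 1.0 / (epoch + 1)  # dummy metric for illustration
    run["train/loss"].append(loss)

# Stop the run to flush all queued data.
run.stop()
```

Once the run stops, the logged parameters and the train/loss series appear in the Neptune web app, where they can be visualized and compared across runs.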
- Get an overview: What is Neptune? What can you do with it? How does it work?
- Try it out: See Neptune in action with our 5-minute "Hello Neptune" example.
- Set it up: Interested in self-hosting/on-prem?
- Get on board: Understand how to get your team started with Neptune and explore best practices.
What's new in the docs
- Sep 24, 2024: Added examples of logging Evidently LLM evaluation reports to Neptune.
- Sep 20, 2024: Clarified how SSO works in Neptune. Added self-hosting FAQ entries. Highlighted usage of a custom monitoring namespace.
- Sep 6, 2024: New how-tos: Set up a run deletion script to automatically free up storage, and log a custom hash when tracking artifact versions. Added examples for managing model stage and querying model metadata when tracking models using runs.
- Aug 23, 2024: New how-tos: Capture external logs from separate scripts, troubleshoot the "timestamp must be non-decreasing" error, and access production-ready models using runs.
- Jul 22, 2024: New how-to: Set custom run color.
- Jul 7, 2024: New integration guide: Great Expectations OSS.
- Jun 25, 2024: Revamped docs for the updated web app. Documented reports and run groups. Documented offline logging. Added tip for improving batching when creating multiple series fields in a single statement.
- Jun 18, 2024: Added Studio example to PyTorch Lightning integration guide. Clarified logging of system metrics and best practices. Added troubleshooting steps for long load times when using Google Load Balancer with self-hosted Neptune.
- Mar 25, 2024: Documented the Neptune Query Language (NQL), which can be used to filter the runs or models queried from a project (see the query sketch after this list). New how-to: Set the logging level.

Looking for product release notes? See the Changelog.
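As a rough sketch of how an NQL filter might be applied from code, the example below assumes the Python client's fetch_runs_table accepts an NQL string via a query parameter; the project name, query string, and field path are illustrative, so check the NQL docs for the exact syntax and supported fields.

```python
import neptune

# Connect to the project in read-only mode; the project name is illustrative.
project = neptune.init_project(
    project="my-workspace/my-project",
    mode="read-only",
)

# Fetch only the runs matching an NQL filter; the query string and the
# `train/loss` field path are assumptions made for this example.
runs_df = project.fetch_runs_table(
    query='`train/loss`:float < 0.2',
).to_pandas()

print(runs_df.head())
```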