Inference Logger
Inference logging lets you capture your model's inference requests and responses in KServe and forward them to a logging endpoint. This is useful for monitoring model performance, debugging issues, auditing predictions, and collecting data for model improvement.
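As a minimal sketch, logging is enabled by adding a `logger` block to a predictor in an InferenceService spec. The service name, model, and URLs below are illustrative placeholders, not required values:

```yaml
apiVersion: serving.kserve.io/v1beta1
kind: InferenceService
metadata:
  name: sklearn-iris        # example name
spec:
  predictor:
    logger:
      mode: all             # log requests and responses; "request" or "response" also valid
      url: http://message-dumper.default.svc.cluster.local  # endpoint that receives log events
    model:
      modelFormat:
        name: sklearn
      storageUri: gs://example-bucket/models/sklearn/model  # placeholder model location
```

With `mode: all`, KServe sends each request and its response as CloudEvents to the configured `url`.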
Prerequisites
Before setting up inference logging, make sure you have:
- A Kubernetes cluster with KServe installed.
- kubectl CLI tool installed and configured.
- Basic knowledge of Kubernetes and KServe concepts.
Logger Configurations
KServe offers several configurations for logging inference data:
- Basic Inference Logger - A simple logger using a message dumper Knative Service
- Request Header Metadata Logger - Log request headers along with inference data
- TLS Enabled Logger - Secure your logs with TLS encryption
- Payload Logger with Knative Eventing - Log through Knative Eventing infrastructure
- Store Logs in Blob Storage - Store logs in cloud blob storage
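For the basic configuration, the log sink can be a simple Knative Service that prints received events. A sketch of such a message dumper is below; the exact `event_display` image path varies by Knative release, so treat it as an assumption to verify against your installed version:

```yaml
apiVersion: serving.knative.dev/v1
kind: Service
metadata:
  name: message-dumper      # matches the logger url used by the InferenceService
spec:
  template:
    spec:
      containers:
        # event_display echoes incoming CloudEvents to its logs for inspection
        - image: gcr.io/knative-releases/knative.dev/eventing/cmd/event_display
```

After applying it, `kubectl logs` on the message-dumper pod shows the logged inference events.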
Each configuration has its own use cases, advantages, and requirements. Choose the most suitable one based on your specific needs.