- A single place to view, filter, and export inference logs (chat completions).
- A unified interface for preparing datasets via SQL queries.
- Reusable datasets for batch inference and fine-tuning.
What lives inside Data Lab
Data Lab stores and manages several types of data:
- Inference Logs: chat completions automatically collected from the API or Playground (unless Zero Data Retention is enabled).
- Filtered Datasets: datasets created from inference logs using SQL queries.
- Uploaded Datasets: user-provided datasets uploaded manually.
- Batch Inference Outputs: results generated by batch inference jobs.
- Fine-tuning Outputs: artifacts produced by fine-tuning jobs, including model checkpoints and the resulting fine-tuned model.
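To illustrate the idea behind filtered datasets, the sketch below runs a SQL filter over a local SQLite table of inference logs. The schema, column names, and model names here are assumptions for illustration only; Data Lab's actual query engine and log schema may differ.

```python
import sqlite3

# Hypothetical, simplified log schema for illustration.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE inference_logs "
    "(id TEXT, model TEXT, prompt TEXT, completion TEXT, created_at TEXT)"
)
conn.executemany(
    "INSERT INTO inference_logs VALUES (?, ?, ?, ?, ?)",
    [
        ("log-1", "model-a", "Hi", "Hello!", "2024-06-01"),
        ("log-2", "model-b", "Sum 2+2", "4", "2024-06-02"),
        ("log-3", "model-a", "Bye", "Goodbye!", "2024-06-03"),
    ],
)

# A filtered dataset: only logs from one model after a given date.
rows = conn.execute(
    "SELECT id, prompt, completion FROM inference_logs "
    "WHERE model = 'model-a' AND created_at >= '2024-06-02'"
).fetchall()
print(rows)  # [('log-3', 'Bye', 'Goodbye!')]
```

The resulting rows form a reusable dataset that downstream jobs (batch inference, fine-tuning) can consume.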
Import Chat Completions
Data Lab lets you import historical chat completion logs into structured datasets for analysis and reuse.
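The import step can be sketched as flattening raw chat completion records into structured rows. The field names (`id`, `model`, `messages`, `response`) are assumptions for illustration and may not match Data Lab's actual log format.

```python
import json

# Hypothetical raw chat completion logs, one JSON record per line.
raw_logs = [
    '{"id": "cmpl-1", "model": "m", '
    '"messages": [{"role": "user", "content": "Hi"}], "response": "Hello!"}',
]

def to_row(line: str) -> dict:
    """Flatten one chat completion log into a structured dataset row."""
    rec = json.loads(line)
    return {
        "id": rec["id"],
        "model": rec["model"],
        "prompt": rec["messages"][-1]["content"],  # last user turn
        "completion": rec["response"],
    }

dataset = [to_row(line) for line in raw_logs]
print(dataset[0]["prompt"])  # Hi
```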
Batch Inference
Batch inference enables asynchronous processing of large datasets without requiring real-time responses.
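A common pattern for batch jobs is preparing an input file with one request per JSONL line, which the job then processes asynchronously. The request schema below (`custom_id`, `body`, the model name) is an assumption for illustration; consult the platform's batch inference documentation for the exact format.

```python
import json

# Build a batch input file: one independent request per JSONL line.
prompts = ["Summarize document A", "Summarize document B"]
requests = [
    {
        "custom_id": f"req-{i}",  # lets you match outputs back to inputs
        "body": {
            "model": "my-model",
            "messages": [{"role": "user", "content": p}],
        },
    }
    for i, p in enumerate(prompts)
]
jsonl = "\n".join(json.dumps(r) for r in requests)
print(jsonl.count("\n") + 1)  # 2 requests
```

Because each line is self-contained, the job can process requests in any order and write results keyed by `custom_id`.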
Fine-tuning
Fine-tuning workflows support preparing, validating, and managing training datasets through the Data Lab interface.
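Validation of a training dataset typically means checking each example against the expected structure. This sketch assumes the common chat-style JSONL format (a `messages` list of role/content pairs ending in an assistant turn); Data Lab's actual validation rules may differ.

```python
import json

VALID_ROLES = {"system", "user", "assistant"}

def validate_example(line: str) -> list:
    """Return a list of problems in one training example (empty = valid)."""
    try:
        rec = json.loads(line)
    except json.JSONDecodeError:
        return ["not valid JSON"]
    msgs = rec.get("messages")
    if not isinstance(msgs, list) or not msgs:
        return ["missing or empty 'messages' list"]
    errors = []
    for i, m in enumerate(msgs):
        if m.get("role") not in VALID_ROLES:
            errors.append(f"message {i}: unknown role {m.get('role')!r}")
        if not isinstance(m.get("content"), str):
            errors.append(f"message {i}: 'content' must be a string")
    if msgs[-1].get("role") != "assistant":
        errors.append("last message should be an assistant turn")
    return errors

good = ('{"messages": [{"role": "user", "content": "Hi"}, '
        '{"role": "assistant", "content": "Hello"}]}')
bad = '{"messages": [{"role": "user", "content": "Hi"}]}'
print(validate_example(good))  # []
print(validate_example(bad))   # ['last message should be an assistant turn']
```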
Data Processing
This section explains how your data is processed, where it is processed, and what level of control you have.