Prompt presets let you save and reuse model setups you’ve tested in Nebius Token Factory, making it easy to manage and share configurations. In the Inference playground, you can store the model parameters, system prompt, and few-shot examples as a prompt preset. Saved presets appear on the Prompt presets page, where you can:
- Reopen them in the playground for further testing
- Export them as code for integration into your application
- Share them with your team
Prompt preset contents
A prompt preset is based on a single model setup from the playground and includes:
- Text-to-text model — other model types are not supported.
- Model parameters available in the playground (e.g., temperature, max tokens). Parameters available only via API are not included.
- System prompt (if added in your setup).
- Few-shot examples — user prompts and AI responses you’ve added.
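The pieces listed above map naturally onto an OpenAI-style chat request: the system prompt and few-shot examples become the message history, and the playground parameters travel alongside them in the request body. A minimal sketch of that mapping (the model ID, prompts, and parameter values below are illustrative placeholders, not taken from an actual preset):

```python
# Sketch: how a preset's contents could map onto a chat-completions
# request. All values here are illustrative placeholders.
system_prompt = "You are a concise technical assistant."

# Few-shot examples: pairs of (user prompt, AI response) saved in the preset.
few_shot = [
    ("What is 2 + 2?", "4"),
    ("What is the capital of France?", "Paris"),
]

messages = [{"role": "system", "content": system_prompt}]
for user_text, assistant_text in few_shot:
    messages.append({"role": "user", "content": user_text})
    messages.append({"role": "assistant", "content": assistant_text})

# The live user prompt goes last.
messages.append({"role": "user", "content": "What is 3 + 3?"})

# Playground parameters (temperature, max tokens, ...) are sent in the
# same request body as the messages.
request_body = {
    "model": "example-model",  # placeholder model ID
    "messages": messages,
    "temperature": 0.7,
    "max_tokens": 256,
}
```

Parameters available only through the API have no playground control, which is why they never end up in a saved preset.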
Creating a preset
- In the left panel, click Save preset. In Compare mode, click → Save preset in the panel of the model you want to save.
- In the dialog, enter a name and (optional) tags.
- Click Save preset.
Using presets
You can manage all saved presets on the Prompt presets page.
- Open in playground — Click a preset to load it back into the playground.
- View as code — Click → View code to generate code for use in your app.
- Share — Click → Share to copy a URL.
Recipients must have a Nebius Token Factory account to access presets and other inference features.
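To give a feel for the View code option, here is a hedged sketch of what exported integration code might resemble, written with only the Python standard library. The endpoint URL, model ID, and environment-variable name are assumptions for illustration; the code actually generated for your preset will contain the real endpoint and your saved parameters.

```python
import json
import os
import urllib.request

# Assumed values for illustration only; real exported code carries the
# actual endpoint, model ID, and the parameters saved in your preset.
BASE_URL = "https://api.example.com/v1/chat/completions"  # placeholder endpoint
API_KEY = os.environ.get("NEBIUS_API_KEY", "")  # assumed variable name

payload = {
    "model": "example-model",  # placeholder model ID
    "messages": [
        {"role": "system", "content": "You are a concise technical assistant."},
        {"role": "user", "content": "Summarize what a prompt preset stores."},
    ],
    "temperature": 0.7,
    "max_tokens": 256,
}

req = urllib.request.Request(
    BASE_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Authorization": f"Bearer {API_KEY}",
    },
)

# Only send the request when an API key is configured.
if API_KEY:
    with urllib.request.urlopen(req) as resp:
        print(json.load(resp)["choices"][0]["message"]["content"])
```

Because the request is plain JSON over HTTPS, the same payload works from any language; the generated code simply saves you from transcribing the preset's parameters by hand.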