Configuring Language Models
Spice supports language models (LLMs) from several sources (see Model Components) and provides configuration for how inference is performed in the Spice runtime. This includes:
- Providing tools to the language model, enabling it to interact with the Spice runtime.
- Specifying system prompts and overriding defaults for `v1/chat/completions`.
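As a sketch, a model can be configured in the spicepod under the `models` section. The model name, the `from` source, and the parameter values below (`tools`, `system_prompt`, and the secret reference) are illustrative assumptions; consult the pages linked below for the exact parameters supported by your Spice version.

```yaml
models:
  - name: chat          # hypothetical name used to address the model at inference
    from: openai:gpt-4o # example source; other model components are supported
    params:
      # Assumed parameter names, shown for illustration:
      openai_api_key: ${ secrets:SPICE_OPENAI_API_KEY }
      tools: auto                       # expose Spice runtime tools to the model
      system_prompt: "You are a helpful assistant."  # override the default prompt
```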
📄️ Default overrides
Learn how to override default LLM hyperparameters in Spice.
📄️ Runtime tools
Learn how LLMs can interact with the Spice runtime.