Configuring Language Models

Spice supports language models (LLMs) from several sources (see model components) and provides configuration for how inference is performed in the Spice runtime. This includes (see the example configuration after this list):

  • Providing tools to the language model, enabling it to interact with the Spice runtime.
  • Specifying system prompts and overriding defaults for `v1/chat/completions`.
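
As a minimal sketch, a model can be configured in `spicepod.yaml` roughly as follows. The `openai:gpt-4o` source, the secret name, and the `tools` and `system_prompt` parameters shown here are illustrative assumptions; check the model component reference for the exact parameters supported by each provider.

```yaml
models:
  - name: assistant
    from: openai:gpt-4o            # model source; see model components for other providers
    params:
      openai_api_key: ${ secrets:SPICE_OPENAI_API_KEY }  # secret reference (assumed name)
      tools: auto                  # expose Spice runtime tools to the model (assumed parameter)
      system_prompt: |             # override the default system prompt for v1/chat/completions
        You are an assistant that answers questions using the datasets available in Spice.
```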