# HuggingFace

To use an ML model hosted on HuggingFace, specify the `huggingface.co` path in `from`, along with the files to include.
Example:

```yaml
models:
  - from: huggingface:huggingface.co/spiceai/darts:latest
    name: hf_model
    files:
      - path: model.onnx
    datasets:
      - taxi_trips
```
## `from` Format

The `from` key must match the following regex:

```
\A(huggingface:)(huggingface\.co\/)?(?<org>[\w\-]+)\/(?<model>[\w\-]+)(:(?<revision>[\w\d\-\.]+))?\z
```
### Examples

- `huggingface:username/modelname`: implies the latest version of `modelname` hosted by `username`.
- `huggingface:huggingface.co/username/modelname:revision`: specifies a particular `revision` of `modelname` by `username`, including the optional domain.
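The `from` values above can be validated and split into their components with the documented regex. A minimal sketch, assuming Python: note that Python's `re` module writes named groups as `(?P<name>...)` and uses `\Z` instead of `\z`, so the pattern is translated accordingly (the `parse_from` helper is illustrative, not part of Spice).

```python
import re

# Python translation of the documented `from` pattern
# ((?P<name>...) named groups, \Z in place of \z).
FROM_PATTERN = re.compile(
    r"\A(huggingface:)(huggingface\.co\/)?"
    r"(?P<org>[\w\-]+)\/(?P<model>[\w\-]+)"
    r"(:(?P<revision>[\w\d\-\.]+))?\Z"
)

def parse_from(value: str) -> dict:
    """Return the org, model, and revision components, or raise ValueError."""
    m = FROM_PATTERN.match(value)
    if m is None:
        raise ValueError(f"invalid huggingface `from` value: {value!r}")
    return {
        "org": m.group("org"),
        "model": m.group("model"),
        # Revision is optional; None means "latest".
        "revision": m.group("revision"),
    }

print(parse_from("huggingface:username/modelname"))
# {'org': 'username', 'model': 'modelname', 'revision': None}
print(parse_from("huggingface:huggingface.co/username/modelname:revision"))
# {'org': 'username', 'model': 'modelname', 'revision': 'revision'}
```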
### Specification

- **Prefix**: The value must start with `huggingface:`.
- **Domain (Optional)**: May include `huggingface.co/` immediately after the prefix. Currently no other HuggingFace-compatible services are supported.
- **Organization/User**: The HuggingFace organization or user (`org`).
- **Model Name**: After a `/`, the model name (`model`).
- **Revision (Optional)**: A colon (`:`) followed by the git-like revision identifier (`revision`).
## Access Tokens

Access tokens can be provided for HuggingFace models in two ways:

1. In the HuggingFace token cache (i.e. `~/.cache/huggingface/token`); this is the default.
2. Via model params (see below).
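For the first option, a minimal shell sketch of placing a token in the default cache location (assumption: `HF_TOKEN` holds your token; running `huggingface-cli login` achieves the same result):

```shell
# Store a HuggingFace access token where it is read from by default.
mkdir -p ~/.cache/huggingface
printf '%s' "$HF_TOKEN" > ~/.cache/huggingface/token
```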
```yaml
models:
  - name: llama_3.2_1B
    from: huggingface:huggingface.co/meta-llama/Llama-3.2-1B
    params:
      hf_token: ${ secrets:HF_TOKEN }
```
## Limitations

- ML models currently only support the ONNX file format.