
Databricks Data Connector

The Databricks Data Connector enables federated SQL queries against Databricks using Spark Connect, or directly against Delta Lake tables.

datasets:
  - from: databricks:spiceai.datasets.my_awesome_table # A reference to a table in the Databricks Unity Catalog
    name: my_delta_lake_table
    params:
      mode: delta_lake
      databricks_endpoint: dbc-a1b2345c-d6e7.cloud.databricks.com
      databricks_token: ${secrets:my_token}
      databricks_aws_access_key_id: ${secrets:aws_access_key_id}
      databricks_aws_secret_access_key: ${secrets:aws_secret_access_key}

Configuration

from

The from field for the Databricks connector takes the form databricks:catalog.schema.table where catalog.schema.table is the fully-qualified path to the table to read from.
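
For example, the following from value references the table my_awesome_table in the datasets schema of the spiceai catalog (illustrative names):

# catalog = spiceai, schema = datasets, table = my_awesome_table
from: databricks:spiceai.datasets.my_awesome_table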

name

The dataset name. This will be used as the table name within Spice.

Example:

datasets:
  - from: databricks:spiceai.datasets.my_awesome_table
    name: cool_dataset
    params: ...

SELECT COUNT(*) FROM cool_dataset;

+----------+
| count(*) |
+----------+
| 6001215  |
+----------+

params

Use the secret replacement syntax to reference a secret, e.g. ${secrets:my_token}.

| Parameter Name | Description |
| -------------- | ----------- |
| mode | The execution mode for querying against Databricks. Default is spark_connect. Possible values: spark_connect (query via Spark Connect; requires an available Spark cluster) or delta_lake (query directly from Delta Tables; requires object store credentials). |
| databricks_endpoint | The endpoint of the Databricks instance. Required for both modes. |
| databricks_token | The Databricks API token used to authenticate with the Databricks instance, as shown in the examples below. |
| databricks_cluster_id | The ID of the compute cluster in Databricks to use for the query. Only valid when mode is spark_connect. |
| databricks_use_ssl | If true, use a TLS connection to connect to the Databricks endpoint. Default is true. |
| client_timeout | Optional. Applicable only in delta_lake mode. Specifies the timeout for object store operations. Default is 30s. E.g. client_timeout: 60s. |
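
For illustration, a minimal sketch of how these connection parameters combine in delta_lake mode (the endpoint and values are placeholders taken from the examples below):

params:
  mode: delta_lake
  databricks_endpoint: dbc-a1b2345c-d6e7.cloud.databricks.com
  databricks_token: ${secrets:my_token}
  databricks_use_ssl: true # Default; set to false to disable TLS
  client_timeout: 60s # delta_lake mode only; defaults to 30s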

Delta Lake object store parameters

Configure the connection to the object store when using mode: delta_lake. Use the secret replacement syntax to reference a secret, e.g. ${secrets:aws_access_key_id}.

AWS S3

| Parameter Name | Description |
| -------------- | ----------- |
| databricks_aws_region | Optional. The AWS region for the S3 object store. E.g. us-west-2. |
| databricks_aws_access_key_id | The access key ID for the S3 object store. |
| databricks_aws_secret_access_key | The secret access key for the S3 object store. |
| databricks_aws_endpoint | Optional. The endpoint for the S3 object store. E.g. s3.us-west-2.amazonaws.com. |

Azure Blob

Note

One of the following auth values must be provided for Azure Blob:

  • databricks_azure_storage_account_key,
  • databricks_azure_storage_client_id and databricks_azure_storage_client_secret, or
  • databricks_azure_storage_sas_key.
| Parameter Name | Description |
| -------------- | ----------- |
| databricks_azure_storage_account_name | The Azure Storage account name. |
| databricks_azure_storage_account_key | The Azure Storage key for accessing the storage account. |
| databricks_azure_storage_client_id | The Service Principal client ID for accessing the storage account. |
| databricks_azure_storage_client_secret | The Service Principal client secret for accessing the storage account. |
| databricks_azure_storage_sas_key | The shared access signature key for accessing the storage account. |
| databricks_azure_storage_endpoint | Optional. The endpoint for the Azure Blob storage account. |

Google Storage (GCS)

| Parameter Name | Description |
| -------------- | ----------- |
| databricks_google_service_account | Filesystem path to the Google service account JSON key file. |

Examples

Spark Connect

- from: databricks:spiceai.datasets.my_spark_table # A reference to a table in the Databricks Unity Catalog
  name: my_spark_table
  params:
    mode: spark_connect
    databricks_endpoint: dbc-a1b2345c-d6e7.cloud.databricks.com
    databricks_cluster_id: 1234-567890-abcde123
    databricks_token: ${secrets:my_token}

Delta Lake (S3)

- from: databricks:spiceai.datasets.my_delta_table # A reference to a table in the Databricks Unity Catalog
  name: my_delta_lake_table
  params:
    mode: delta_lake
    databricks_endpoint: dbc-a1b2345c-d6e7.cloud.databricks.com
    databricks_token: ${secrets:my_token}
    databricks_aws_region: us-west-2 # Optional
    databricks_aws_access_key_id: ${secrets:aws_access_key_id}
    databricks_aws_secret_access_key: ${secrets:aws_secret_access_key}
    databricks_aws_endpoint: s3.us-west-2.amazonaws.com # Optional

Delta Lake (Azure Blobs)

- from: databricks:spiceai.datasets.my_adls_table # A reference to a table in the Databricks Unity Catalog
  name: my_delta_lake_table
  params:
    mode: delta_lake
    databricks_endpoint: dbc-a1b2345c-d6e7.cloud.databricks.com
    databricks_token: ${secrets:my_token}

    # Account Name + Key
    databricks_azure_storage_account_name: my_account
    databricks_azure_storage_account_key: ${secrets:my_key}

    # OR Service Principal + Secret
    databricks_azure_storage_client_id: my_client_id
    databricks_azure_storage_client_secret: ${secrets:my_secret}

    # OR SAS Key
    databricks_azure_storage_sas_key: my_sas_key

Delta Lake (GCP)

- from: databricks:spiceai.datasets.my_gcp_table # A reference to a table in the Databricks Unity Catalog
  name: my_delta_lake_table
  params:
    mode: delta_lake
    databricks_endpoint: dbc-a1b2345c-d6e7.cloud.databricks.com
    databricks_token: ${secrets:my_token}
    databricks_google_service_account: /path/to/service-account.json

Types

mode: delta_lake

The table below shows the supported Databricks (mode: delta_lake) data types and their mapping to Apache Arrow types in Spice.

| Databricks SQL Type | Arrow Type |
| ------------------- | ---------- |
| STRING | Utf8 |
| BIGINT | Int64 |
| INT | Int32 |
| SMALLINT | Int16 |
| TINYINT | Int8 |
| FLOAT | Float32 |
| DOUBLE | Float64 |
| BOOLEAN | Boolean |
| BINARY | Binary |
| DATE | Date32 |
| TIMESTAMP | Timestamp(Microsecond, Some("UTC")) |
| TIMESTAMP_NTZ | Timestamp(Microsecond, None) |
| DECIMAL | Decimal128 |
| ARRAY | List |
| STRUCT | Struct |
| MAP | Map |
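
To check how a given table's columns were mapped, you can inspect the dataset's schema from a SQL client connected to Spice; a minimal sketch, assuming the DESCRIBE statement is available in the runtime's SQL dialect and a dataset named my_delta_lake_table:

DESCRIBE my_delta_lake_table;
-- Lists each column with its Arrow-backed data type, e.g. a Databricks
-- TIMESTAMP column surfaces as Timestamp(Microsecond, Some("UTC")).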

Secrets

Spice integrates with multiple secret stores to help manage sensitive data securely. For detailed information on supported secret stores, refer to the secret stores documentation. Additionally, learn how to use referenced secrets in component parameters by visiting the using referenced secrets guide.
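
As an illustration, a minimal sketch of a spicepod secrets section using an environment-variable store (the store configuration and the SPICE_ variable-prefix convention shown here are assumptions; confirm the details against the secret stores documentation):

secrets:
  - from: env # Assumed store type; resolves ${secrets:...} references
    name: env

# With such a store, ${secrets:my_token} would typically resolve from an
# environment variable like SPICE_MY_TOKEN (assumed prefix convention).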