Upload a file
This recipe uploads a file to a Unity Catalog volume using the Databricks SDK for Python.
Code snippet
```python
import io

import streamlit as st
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()

uploaded_file = st.file_uploader(label="Select file")
upload_volume_path = st.text_input(
    label="Specify a three-level Unity Catalog volume name (catalog.schema.volume_name)",
    placeholder="main.marketing.raw_files",
)

if st.button("Save changes"):
    if uploaded_file is None or not upload_volume_path:
        st.warning("Select a file and specify a volume name first.")
    else:
        # Read the uploaded file into an in-memory binary stream
        file_bytes = uploaded_file.read()
        binary_data = io.BytesIO(file_bytes)
        file_name = uploaded_file.name

        # Split the three-level name into catalog, schema, and volume
        catalog, schema, volume_name = upload_volume_path.strip().split(".")
        volume_file_path = f"/Volumes/{catalog}/{schema}/{volume_name}/{file_name}"

        # Upload to the Unity Catalog volume, overwriting any existing file
        w.files.upload(volume_file_path, binary_data, overwrite=True)
        st.success(f"File uploaded to {volume_file_path}")
```
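The snippet assumes the text input is a well-formed three-level name. A small helper (hypothetical, not part of the SDK) can validate the input and build the volume path before calling the Files API:

```python
def volume_file_path(three_level_name: str, file_name: str) -> str:
    """Build a /Volumes/... path from a catalog.schema.volume_name string.

    Raises ValueError if the name does not have exactly three non-empty parts.
    """
    parts = three_level_name.strip().split(".")
    if len(parts) != 3 or not all(parts):
        raise ValueError(f"Expected catalog.schema.volume_name, got: {three_level_name!r}")
    catalog, schema, volume_name = parts
    return f"/Volumes/{catalog}/{schema}/{volume_name}/{file_name}"


# volume_file_path("main.marketing.raw_files", "report.csv")
# → "/Volumes/main/marketing/raw_files/report.csv"
```

Validating up front lets the app show a friendly error instead of failing inside the upload call.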
Resources
Permissions
Your app service principal needs the following permissions:
- USE CATALOG on the catalog of the volume
- USE SCHEMA on the schema of the volume
- READ VOLUME and WRITE VOLUME on the volume
See Privileges required for volume operations for more information.
If you declare volume access in a Databricks Asset Bundle, the resources.apps[*].resources[*].uc_securable mapping may not grant USE_CATALOG and USE_SCHEMA on the parent catalog and schema, even though the app still needs them at runtime. Until bundles can declare those parent grants, add the privileges manually. Alternatively, see apps_grants_sync, an example Databricks App and Asset Bundle that wires experimental.scripts.postdeploy so that parent privileges are applied after each databricks bundle deploy; copy its tools/ directory into your bundle or mirror the same pattern in databricks.yml.
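Adding the privileges manually can be done with SQL GRANT statements. A sketch, using the placeholder names from this recipe and a hypothetical service principal identifier (replace all of them with your own):

```sql
-- Hypothetical names: catalog "main", schema "marketing", volume "raw_files".
-- Replace <app-service-principal> with your app's service principal application ID.
GRANT USE CATALOG ON CATALOG main TO `<app-service-principal>`;
GRANT USE SCHEMA ON SCHEMA main.marketing TO `<app-service-principal>`;
GRANT READ VOLUME, WRITE VOLUME ON VOLUME main.marketing.raw_files TO `<app-service-principal>`;
```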
Dependencies
- Databricks SDK for Python - databricks-sdk
- Streamlit - streamlit