Read a Delta table
This recipe reads a Unity Catalog table using the Databricks SQL Connector.
Code snippet
app.py
```python
import streamlit as st
from databricks import sql
from databricks.sdk.core import Config

cfg = Config()  # Set the DATABRICKS_HOST environment variable when running locally


@st.cache_resource(ttl="1h")  # Connection is cached for one hour
def get_connection(http_path):
    return sql.connect(
        server_hostname=cfg.host,
        http_path=http_path,
        credentials_provider=lambda: cfg.authenticate,
    )


def read_table(table_name, conn):
    with conn.cursor() as cursor:
        query = f"SELECT * FROM {table_name}"
        cursor.execute(query)
        return cursor.fetchall_arrow().to_pandas()


http_path_input = st.text_input(
    "Enter your Databricks HTTP Path:", placeholder="/sql/1.0/warehouses/xxxxxx"
)
table_name = st.text_input(
    "Specify a Unity Catalog table name:", placeholder="catalog.schema.table"
)

if http_path_input and table_name:
    conn = get_connection(http_path_input)
    df = read_table(table_name, conn)
    st.dataframe(df)
```
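Note that `read_table` interpolates the user-supplied table name directly into the SQL string, which offers no protection against SQL injection. A minimal sketch of one way to guard against this, using a hypothetical `is_valid_table_name` helper (not part of the recipe above) that accepts only the three-part `catalog.schema.table` form:

```python
import re

# Hypothetical helper: accept only three dot-separated identifiers
# (word characters only), e.g. catalog.schema.table, before the name
# is interpolated into a SQL string.
_TABLE_NAME = re.compile(r"^\w+\.\w+\.\w+$")


def is_valid_table_name(table_name: str) -> bool:
    """Return True only for names shaped like catalog.schema.table."""
    return bool(_TABLE_NAME.match(table_name))
```

In the app, you might call this before `read_table` and show `st.error(...)` for inputs that fail the check.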
Info: This sample uses Streamlit's st.cache_resource with a 1-hour TTL (time-to-live) to cache the database connection across users, sessions, and reruns. The cached connection automatically expires after 1 hour, so connections don't become stale. Use Streamlit's caching decorators and TTL parameter to implement a caching strategy that works for your use case.
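Conceptually, a TTL'd resource cache returns the stored value while it is fresh and rebuilds it once the TTL has elapsed. A minimal sketch of that behavior (this is an illustration, not Streamlit's actual implementation):

```python
import time


def ttl_cache(ttl_seconds):
    """Sketch of a TTL cache: reuse a cached value until ttl_seconds elapse."""

    def decorator(fn):
        cache = {}

        def wrapper(*args):
            now = time.monotonic()
            if args in cache:
                value, stored_at = cache[args]
                if now - stored_at < ttl_seconds:
                    return value  # still fresh: reuse the cached resource
            value = fn(*args)  # missing or expired: rebuild and re-cache
            cache[args] = (value, now)
            return value

        return wrapper

    return decorator
```

With this decorator, repeated calls with the same arguments inside the TTL window return the same object without re-running the wrapped function, which is what makes it suitable for expensive resources like database connections.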
Resources
Permissions
Your app service principal needs the following permissions:
- SELECT on the Unity Catalog table
- CAN USE on the SQL warehouse
See Unity Catalog privileges and securable objects for more information.
Dependencies
- Databricks SDK (databricks-sdk)
- Databricks SQL Connector (databricks-sql-connector)
- Streamlit (streamlit)
requirements.txt
```
databricks-sdk
databricks-sql-connector
streamlit
```