
Databricks + Go Fig

Data Stack

Connect Databricks lakehouse data to Go Fig for unified analytics and AI-powered financial insights.

Databricks is where many data teams consolidate the lakehouse, and finance is increasingly a downstream consumer of gold-layer tables. Go Fig connects to Databricks SQL Warehouses via the official driver, so Celeste and AI financial analysts can traverse Unity Catalog, query Delta tables in the Financial Intelligence Graph, and join them to GL, CRM, billing, and HRIS data already wired in. The connector honors Unity Catalog's three-level namespace (catalog.schema.table), row-level filters, and column masks, so governance policies set by the data platform team travel with the data into Go Fig.

Delta Lake features work as expected: time travel (VERSION AS OF or TIMESTAMP AS OF) for historical snapshots, Change Data Feed (CDF) for incremental reads when enabled, and partition pruning on date columns. Serverless SQL Warehouses are recommended for BI workloads since they start in seconds and auto-suspend, making on-demand finance queries cheap. For regulated workloads, Go Fig supports AWS PrivateLink, Azure Private Link, and OAuth M2M for keyless authentication.

Key facts

Compute: SQL Warehouse (Serverless or Classic)
Governance: Unity Catalog with row/column masks
Incremental sync: Delta Change Data Feed (CDF)
Time travel: VERSION AS OF or TIMESTAMP AS OF
Auth: OAuth M2M, PAT, SAML SSO

SOC 2 Type II · All integrations

What you can do with Databricks data in Go Fig

Gold-layer reporting in finance

Let finance consume the data team's curated gold-layer Delta tables directly. Celeste can query them for revenue, pipeline, and unit-economics metrics without a parallel pipeline.

ML-enriched financial analysis

Join MLflow-scored churn, LTV, or demand-forecast outputs stored in Delta to GL and bookings data, so finance decisions reflect the data science team's models.
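A sketch of what such a join might look like in Databricks SQL. Every catalog, schema, table, and column name here is illustrative, not part of Go Fig or any standard lakehouse layout:

```sql
-- Join the latest ML churn/LTV scores onto revenue-by-account
-- (all names below are placeholders for your own gold-layer tables).
SELECT
  a.account_id,
  a.arr,
  s.churn_probability,
  s.predicted_ltv
FROM finance_gold.revenue.arr_by_account AS a
JOIN ml_gold.scoring.churn_scores AS s
  ON s.account_id = a.account_id
WHERE s.scored_at = (SELECT MAX(scored_at) FROM ml_gold.scoring.churn_scores);
```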

Lakehouse-first close

Pull close-process tables (journal detail, subledger summaries, reconciliations) curated in Databricks alongside QuickBooks or NetSuite for a governed month-end review.

Data available from Databricks

Go Fig extracts and normalizes the following data from your Databricks account:

Delta Lake tables
Unity Catalog schemas
Materialized views
Streaming tables
External tables (Iceberg, Hive)
MLflow model outputs
Delta Change Data Feed
Time-travel snapshots
Partition metadata
Row-filter-aware reads

How to connect Databricks

1

Create a Serverless SQL Warehouse for BI workloads

In Databricks, provision a Serverless SQL Warehouse (Small is usually enough for finance BI) with auto-stop at 10 minutes. Serverless is preferred over Classic for this use case because cold starts are measured in seconds and you don't pay for idle capacity. Copy the HTTP Path (/sql/1.0/warehouses/...) and hostname.

2

Create a service principal and grant Unity Catalog privileges

In the Databricks account console, create a service principal fig-reader and generate OAuth M2M credentials. In Unity Catalog, GRANT USE CATALOG on the target catalog, USE SCHEMA on relevant schemas, and SELECT on the specific tables and views Go Fig should read. Scope to gold-layer or finance-specific schemas rather than granting broadly.
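The grants described above, written out in Unity Catalog SQL. Catalog, schema, and table names are placeholders for your own gold layer; in grant statements, service principals are typically referenced by their application ID, so the readable name is illustrative:

```sql
-- Read-only, narrowly scoped access for the Go Fig service principal.
GRANT USE CATALOG ON CATALOG finance_gold TO `fig-reader`;
GRANT USE SCHEMA ON SCHEMA finance_gold.revenue TO `fig-reader`;
GRANT SELECT ON TABLE finance_gold.revenue.arr_by_account TO `fig-reader`;

-- Or, if every table in a schema is safe to expose, grant SELECT once
-- at the schema level and let it apply to current and future tables:
GRANT SELECT ON SCHEMA finance_gold.revenue TO `fig-reader`;
```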

3

Enable Change Data Feed on incremental tables

For tables Go Fig will sync incrementally, enable CDF with ALTER TABLE ... SET TBLPROPERTIES (delta.enableChangeDataFeed = true). Go Fig then reads table_changes() for row-level deltas rather than full table scans, which is dramatically cheaper on large fact tables. For append-only tables, a monotonic timestamp column works too.
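Both halves of that flow in Databricks SQL, with an illustrative table name and starting version:

```sql
-- 1. Enable Change Data Feed on an existing Delta table.
ALTER TABLE finance_gold.gl.journal_lines
  SET TBLPROPERTIES (delta.enableChangeDataFeed = true);

-- 2. Read row-level changes since the version recorded at the last sync.
SELECT *
FROM table_changes('finance_gold.gl.journal_lines', 1042)
WHERE _change_type IN ('insert', 'update_postimage', 'delete');
```

The `_change_type` filter drops `update_preimage` rows, since a downstream sync usually only needs the post-update state of each row.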

4

Configure network and connect

If the workspace is behind Private Link, configure the link to Go Fig's tenant or use a customer-deployed runner inside your VPC. Paste the hostname, HTTP Path, and OAuth credentials into Go Fig. The connector enumerates catalogs, schemas, and tables through the information_schema, respecting Unity Catalog privileges.
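Roughly what that discovery pass looks like in Databricks SQL; the catalog filter is illustrative, and results are limited to objects the service principal can see:

```sql
-- Enumerate tables and views visible to the connecting principal.
SELECT table_catalog, table_schema, table_name, table_type
FROM system.information_schema.tables
WHERE table_catalog = 'finance_gold'
ORDER BY table_schema, table_name;
```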

Authentication: OAuth machine-to-machine (M2M) via a Databricks service principal is the recommended production path. Assign the service principal minimum Unity Catalog privileges (USE CATALOG, USE SCHEMA, SELECT on specific tables). Personal Access Tokens (PATs) are supported for POCs. For private workspaces, use AWS PrivateLink or Azure Private Link to Go Fig. The connector uses the Databricks SQL driver and respects Unity Catalog's fine-grained access control.

Common Questions About Databricks Integration

Does Go Fig work with Unity Catalog's row filters and column masks?

Yes. Because Go Fig connects through the SQL Warehouse as a specific service principal, Unity Catalog enforces row filters and column masks at query time based on that principal's identity. You don't have to re-implement governance inside Go Fig; what the service principal can see is exactly what Go Fig sees.

Serverless SQL Warehouse vs. Classic: which should I use?

Serverless for almost every finance BI use case. Cold starts are single-digit seconds, you pay only while queries run, and auto-stop keeps costs bounded. Classic is worth it only if you're doing heavy continuous workloads that justify a persistently running cluster, which is rare for finance reporting.

Can Go Fig do incremental reads on large Delta tables?

Yes, two ways. If Change Data Feed (CDF) is enabled on the table, Go Fig reads table_changes() for true row-level CDC between syncs. If CDF is not enabled, Go Fig can use a monotonic high-watermark column (last_updated_at, sequence) for append-only or update-timestamp patterns. Full refresh is the fallback for small tables.
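A sketch of the high-watermark pattern, assuming a hypothetical invoices table with a `last_updated_at` column; the watermark value would be carried over from the previous sync rather than hard-coded:

```sql
-- Fetch only rows changed since the last recorded watermark.
SELECT *
FROM finance_gold.billing.invoices
WHERE last_updated_at > TIMESTAMP '2026-01-31 23:59:59';
```

After each sync, the new watermark is simply `MAX(last_updated_at)` over the rows just read.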

Does Go Fig support Delta time travel for historical snapshots?

Yes. You can point a Go Fig sync at a specific table version (VERSION AS OF 42) or timestamp (TIMESTAMP AS OF '2026-01-01'). This is useful for reproducing a historical close or for audit trails where finance needs to re-run a report as of a specific date.
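In Delta's SQL syntax the two forms look like this; the table name and aggregate are illustrative:

```sql
-- Re-run a total as of a pinned table version...
SELECT SUM(amount) AS total_booked
FROM finance_gold.gl.journal_lines VERSION AS OF 42;

-- ...or as of a wall-clock timestamp.
SELECT SUM(amount) AS total_booked
FROM finance_gold.gl.journal_lines TIMESTAMP AS OF '2026-01-01';
```

Note the retention caveat: time travel only reaches back as far as the table's Delta log and data-file retention settings allow.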

What about non-Delta external tables (Iceberg, Hive)?

Unity Catalog external tables, including Iceberg and Hive formats, are accessible through the SQL Warehouse just like Delta tables. Performance and CDC semantics vary by format (CDF is Delta-only), but standard SELECT reads and partition pruning work across formats.

Ready to connect Databricks?

See how your Databricks data looks in Go Fig with a personalized demo.

Book a Demo