
Google Cloud Storage + Go Fig

Cloud Storage

Connect Google Cloud Storage to Go Fig for analytics on your cloud-stored data files.

Google Cloud Storage holds your data exports, backups, and data lake files. Go Fig connects directly to GCS buckets, letting you query Parquet, CSV, and JSON files alongside your other business data. Turn your cloud storage into an analytics-ready data source without building pipelines.

Key facts

File formats
Parquet, CSV, JSON, Avro
Auth
Service Account or Workload Identity
Partitioning
Hive-style path partitions
Sync mode
Object-version aware

SOC 2 Type II

What you can do with Google Cloud Storage data in Go Fig

Data Lake Analytics

Query Parquet and CSV files in GCS directly from Go Fig dashboards.

Export Integration

Analyze data exports from other systems stored in GCS buckets.

Historical Data Access

Connect archived data in cloud storage to current operational data.

Data available from Google Cloud Storage

Go Fig extracts and normalizes the following data from your Google Cloud Storage account:

Parquet files
CSV files
JSON documents
Bucket contents
Nested directories
Partitioned data
Data exports
Archived records

How to connect Google Cloud Storage

1

Create a service account

In GCP IAM, create a service account scoped to the buckets Go Fig should read. Grant Storage Object Viewer (read) at the bucket or prefix level rather than project-wide. Download the JSON key (or configure Workload Identity Federation for keyless auth).

2

Point Go Fig at your buckets and prefixes

Specify bucket names and optional prefixes (e.g., gs://my-bucket/exports/quickbooks/). Go Fig handles Hive-style partitioning automatically (year=/month=/day=), so you can query date-partitioned exports without flattening files first.
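The Hive-style layout mentioned above can be illustrated with a small path parser. This is a sketch of how `key=value` segments become partition columns, not Go Fig's actual implementation; the example path is hypothetical.

```python
import re

# Match one Hive-style partition segment, e.g. "year=2024" or "month=04".
PARTITION_SEGMENT = re.compile(r"^([A-Za-z_][A-Za-z0-9_]*)=(.*)$")

def parse_partitions(object_path: str) -> dict[str, str]:
    """Return {column: value} for every key=value segment in the path."""
    partitions = {}
    for segment in object_path.split("/"):
        match = PARTITION_SEGMENT.match(segment)
        if match:
            partitions[match.group(1)] = match.group(2)
    return partitions

path = "exports/quickbooks/year=2024/month=04/day=15/part-0001.parquet"
print(parse_partitions(path))  # {'year': '2024', 'month': '04', 'day': '15'}
```

Each extracted key becomes a queryable column, which is why date-partitioned exports don't need to be flattened first.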

3

Declare schema or let Go Fig infer it

Parquet files are read with the embedded schema. For CSV and JSON, Go Fig auto-infers types from the first 1,000 rows, or you can pin a schema definition. Schema drift is detected and surfaced rather than silently coerced.
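Sampling-based inference works roughly like the toy below: try the narrowest type that parses every sampled value, then widen. The type names and rules here are illustrative; Go Fig's real inference logic is not public.

```python
import csv, io

def infer_type(values: list[str]) -> str:
    """Pick the narrowest type that parses every non-empty sampled value."""
    def all_parse(cast):
        try:
            for v in values:
                if v != "":
                    cast(v)
            return True
        except ValueError:
            return False
    if all_parse(int):
        return "INTEGER"
    if all_parse(float):
        return "FLOAT"
    return "STRING"

def infer_schema(csv_text: str, sample_rows: int = 1000) -> dict[str, str]:
    reader = csv.DictReader(io.StringIO(csv_text))
    rows = [row for _, row in zip(range(sample_rows), reader)]
    return {col: infer_type([r[col] for r in rows]) for col in reader.fieldnames}

sample = "invoice_id,amount,customer\n1001,250.00,Acme\n1002,99.95,Globex\n"
print(infer_schema(sample))
# {'invoice_id': 'INTEGER', 'amount': 'FLOAT', 'customer': 'STRING'}
```

A value outside the sampled rows that breaks the inferred type is the "schema drift" case the step describes, which is why pinning a schema is offered as the stricter alternative.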

4

Set sync cadence

By default Go Fig polls for new files every 15 minutes. For low-latency use cases (e.g., reacting to a Cloud Function export), Pub/Sub notifications can trigger near-real-time ingestion.
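The polling mode amounts to remembering which objects have been seen and ingesting only the new ones on each pass. The sketch below stands a local directory in for a GCS prefix and compresses the 15-minute cadence to back-to-back calls; it is illustrative, not connector code.

```python
from pathlib import Path
from tempfile import TemporaryDirectory

def poll_new_files(root: Path, seen: set[str]) -> list[Path]:
    """Return objects not seen on a previous poll, and mark them as seen."""
    new = [p for p in sorted(root.glob("*.parquet")) if p.name not in seen]
    seen.update(p.name for p in new)
    return new

with TemporaryDirectory() as d:
    bucket = Path(d)          # stand-in for gs://my-bucket/exports/
    seen: set[str] = set()
    (bucket / "a.parquet").touch()
    print([p.name for p in poll_new_files(bucket, seen)])  # ['a.parquet']
    (bucket / "b.parquet").touch()
    print([p.name for p in poll_new_files(bucket, seen)])  # ['b.parquet']
```

The Pub/Sub path replaces this loop with a push: the bucket notification fires on object finalize, so ingestion starts seconds after the write instead of on the next poll.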

Authentication: Service Account JSON key with Storage Object Viewer role on the buckets you choose. Workload Identity Federation is also supported if you don't want to manage long-lived keys.

Common Questions About Google Cloud Storage Integration

Which file formats does Go Fig support from GCS?

Parquet, CSV, TSV, JSON, JSON Lines (NDJSON), Avro, and ORC. Parquet is the recommended format because the schema is embedded and partition pruning is significantly faster. For CSV uploads from non-technical sources, Go Fig handles common quirks (BOM, mixed encodings, irregular quoting) without requiring file pre-processing.
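Two of the quirks named above, a UTF-8 byte-order mark and commas inside quoted fields, can be shown in a few lines of standard-library Python. This illustrates the problems, not Go Fig's internal parser: decoding with `utf-8-sig` strips the BOM so it doesn't corrupt the first header, and the `csv` module resolves the quoting.

```python
import csv, io

# A CSV as exported by a spreadsheet tool: leading BOM, quoted field
# containing a comma.
raw = b"\xef\xbb\xbfname,notes\nAcme,\"line one, with comma\"\n"

rows = list(csv.DictReader(io.StringIO(raw.decode("utf-8-sig"))))
print(rows[0]["name"])   # Acme  (the BOM did not leak into the header)
print(rows[0]["notes"])  # line one, with comma
```

Decoding the same bytes as plain `utf-8` would leave `\ufeff` glued to the first column name, which is the classic symptom of an unhandled BOM.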

How does Go Fig handle Hive-style partitioned data?

Native support for partition columns (year=2024/month=04/day=15). Partitions become queryable columns automatically and partition pruning happens at the GCS list-objects level so you don't pay for scanning files outside your filter. This matters when you have years of historical exports in a bucket.
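List-level pruning means the partition filter is applied before any objects are listed: the filter is expanded into the matching prefixes, and only those prefixes are enumerated. A minimal sketch, with a hypothetical bucket and filter:

```python
def pruned_prefixes(base: str, filters: dict[str, list[str]]) -> list[str]:
    """Expand a partition filter into the GCS prefixes worth listing."""
    prefixes = [base]
    for column, values in filters.items():
        prefixes = [f"{p}{column}={v}/" for p in prefixes for v in values]
    return prefixes

print(pruned_prefixes("gs://my-bucket/exports/",
                      {"year": ["2024"], "month": ["03", "04"]}))
# ['gs://my-bucket/exports/year=2024/month=03/',
#  'gs://my-bucket/exports/year=2024/month=04/']
```

With years of daily exports in a bucket, listing two month prefixes instead of the whole tree is where the cost saving comes from.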

Can Go Fig read from a GCS bucket that's private or VPC-restricted?

Yes. Bucket-level IAM (via the service account) is the standard path. For VPC Service Controls or Private Google Access, Go Fig's egress IPs can be allowlisted to your perimeter, or you can run our connector inside your own VPC for fully private access.

What's the freshness for newly-written GCS files?

Default polling cadence is 15 minutes. For real-time, configure Pub/Sub notifications on the bucket and Go Fig will ingest within seconds of file write. Most analytics use cases don't need sub-minute freshness, but the option exists for things like operational dashboards.

Can I write data back to GCS from Go Fig?

Yes. Flow outputs can write to a designated GCS bucket as Parquet, CSV, or JSON. Common patterns include exporting daily P&L snapshots for downstream BigQuery loading, or writing reconciled finance data back to a partitioned bucket for the data team to consume.
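The write-back pattern is a snapshot written under a date-partitioned prefix, so downstream loaders can prune by date the same way reads do. The sketch below uses the local filesystem as a stand-in for the GCS bucket; the layout, filename, and account rows are illustrative, not a Go Fig flow definition.

```python
import csv
from datetime import date
from pathlib import Path
from tempfile import TemporaryDirectory

def snapshot_path(root: Path, as_of: date) -> Path:
    """Hive-partitioned output path for a daily snapshot."""
    return (root / f"year={as_of:%Y}" / f"month={as_of:%m}"
                 / f"day={as_of:%d}" / "pnl.csv")

def write_snapshot(root: Path, as_of: date, rows: list[dict]) -> Path:
    path = snapshot_path(root, as_of)
    path.parent.mkdir(parents=True, exist_ok=True)
    with path.open("w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=rows[0].keys())
        writer.writeheader()
        writer.writerows(rows)
    return path

with TemporaryDirectory() as d:
    out = write_snapshot(Path(d), date(2024, 4, 15),
                         [{"account": "4000-Revenue", "balance": "125000.00"}])
    print(out.relative_to(d))  # year=2024/month=04/day=15/pnl.csv
```

Because the output reuses the `year=/month=/day=` convention, a BigQuery external table or another Go Fig connection can consume it with no extra wiring.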

Industries running Google Cloud Storage with Go Fig

Strategic CFOs in these industries typically stitch Google Cloud Storage into their Financial Intelligence Graph alongside their ERP and operational systems.

Ready to connect Google Cloud Storage?

See how your Google Cloud Storage data looks in Go Fig with a personalized demo.

Book a Demo