GA4 BigQuery Export: Complete Setup Guide

Set up the free GA4 BigQuery export to access raw event data. Covers project setup, linking, costs, query examples, and when the export is worth it.


GA4’s built-in reports are limited. You can’t query individual events, join with external data, or run custom analysis beyond what the Explorations tool offers. The BigQuery export gives you raw, event-level data — every pageview, every click, every purchase — in a SQL-queryable database.

The daily export is free. The streaming (real-time) export costs money. This guide covers how to set up the daily export, what it costs to query, and whether you actually need it.

Why Export to BigQuery

What GA4 Reports Can’t Do

| Need | GA4 UI | BigQuery |
|---|---|---|
| Query individual user journeys | Limited (User Explorer) | Full SQL access |
| Join with CRM data | No | Yes (join on user_id) |
| Custom attribution modeling | No (fixed models) | Yes (build your own) |
| Unsampled data at scale | Sampled above thresholds | Always unsampled |
| Data beyond retention period | Lost after 2/14 months | Retained indefinitely |
| Cross-property analysis | No | Yes (query multiple exports) |
| Feed data to ML models | No | Yes (BigQuery ML) |
| Custom funnel analysis | Limited | Unlimited |

If you’ve hit limits with GA4’s data retention settings, BigQuery is the escape hatch. Your exported data stays in BigQuery forever (or until you delete it), regardless of GA4’s retention window.

Who Actually Needs This

You need BigQuery if:

  • You have more than 100K monthly users and hit sampling limits in GA4
  • You need to join analytics data with backend data (CRM, inventory, etc.)
  • You want custom attribution beyond GA4’s built-in models
  • You need data retention beyond 14 months
  • You’re building dashboards in Looker Studio or Tableau from raw data
  • Your data team wants SQL access to analytics

You don’t need BigQuery if:

  • GA4’s standard reports answer your questions
  • You have fewer than 10K monthly users (sampling isn’t an issue)
  • Nobody on your team knows SQL
  • You just want basic traffic and conversion reporting

Step 1: Set Up a Google Cloud Project

If you don’t already have a Google Cloud account:

  1. Go to console.cloud.google.com
  2. Click Create Project
  3. Name it something recognizable: “Analytics Export” or “[Brand] Data”
  4. Select your billing account (or create one — you need a credit card, but BigQuery has a generous free tier)

Enable the BigQuery API

  1. In Google Cloud Console, go to APIs & Services → Enable APIs
  2. Search for “BigQuery API”
  3. Click Enable

Set Up Billing

BigQuery’s free tier includes:

  • 1 TB of queries per month (free)
  • 10 GB of storage per month (free)

For most GA4 exports, you’ll stay well within the free tier for queries. Storage will exceed 10 GB after a few months of data for high-traffic sites, but BigQuery storage is cheap ($0.02/GB/month for active storage, $0.01/GB/month for long-term).

Cost estimate by traffic level:

| Monthly Users | Daily Export Size | Monthly Storage | Monthly Query Cost |
|---|---|---|---|
| 10K | ~50 MB/day | ~1.5 GB | Free (well under 1 TB) |
| 100K | ~500 MB/day | ~15 GB | Free to ~$5 |
| 1M | ~5 GB/day | ~150 GB | $5-20 |
| 10M | ~50 GB/day | ~1.5 TB | $20-100+ |
Step 2: Link GA4 to BigQuery

  1. Open GA4 → Admin → Property → BigQuery Links
  2. Click Link
  3. Select your Google Cloud project
  4. Choose your options:

Export Options

| Option | What It Does | Cost |
|---|---|---|
| Daily export | Exports all events from the previous day, once per day | Free (Google pays) |
| Streaming export | Exports events in near-real-time (within minutes) | $$$ (you pay per event) |
| Include advertising identifiers | Includes Google Ads click IDs, campaign data | Free |

Recommendation: Start with Daily export only. Streaming costs $0.05 per 200 MB of streamed data, which adds up fast at high traffic volumes.

Data Location

Choose the BigQuery dataset location closest to your team:

  • US (multi-region) — default, cheapest
  • EU (multi-region) — required if GDPR mandates EU data residency
  • Specific region (e.g., us-east1) — for lowest latency if co-locating with other GCP services

Important: You can’t change the location after export starts. Choose carefully.

Frequency

  • Daily — data arrives the next morning (usually by 8 AM in your property’s timezone)
  • Continuous — data streams every few minutes (streaming export, costs money)

  5. Click Submit

The first export will appear within 24-48 hours. It will contain one day of data.
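Once the link is active, you can confirm that tables are arriving by listing the dataset's tables. A quick sketch (`your_project` and `123456789` are placeholders for your project ID and GA4 property ID):

```sql
-- List the most recent daily export tables in the dataset.
SELECT table_name
FROM `your_project.analytics_123456789.INFORMATION_SCHEMA.TABLES`
WHERE table_name LIKE 'events_%'
ORDER BY table_name DESC
LIMIT 7
```

If this returns no rows after 48 hours, check the link status in GA4 Admin.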

Step 3: Understand the Schema

GA4 exports data in a specific table structure. Each day creates a new table:

your_project.analytics_PROPERTY_ID.events_YYYYMMDD

For example: my-project.analytics_123456789.events_20260401

There’s also an intraday table (if streaming is enabled): events_intraday_YYYYMMDD
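With streaming enabled, today's partial data can be pulled from the intraday table. A sketch, with placeholder project and property IDs:

```sql
-- Today's events so far, from the intraday (streaming) table.
SELECT event_name, COUNT(*) AS events
FROM `your_project.analytics_123456789.events_intraday_*`
WHERE _TABLE_SUFFIX = FORMAT_DATE('%Y%m%d', CURRENT_DATE())
GROUP BY event_name
ORDER BY events DESC
```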

Key Columns

| Column | Type | Description |
|---|---|---|
| event_name | STRING | The event name (page_view, purchase, etc.) |
| event_timestamp | INTEGER | Unix microseconds |
| event_date | STRING | YYYYMMDD format |
| user_pseudo_id | STRING | GA4's anonymous user ID |
| user_id | STRING | Your custom user_id (if set) |
| event_params | RECORD | Array of key-value pairs (page_title, page_location, etc.) |
| user_properties | RECORD | Array of user-scoped properties |
| device | RECORD | Device category, browser, OS, etc. |
| geo | RECORD | Country, city, region |
| traffic_source | RECORD | Campaign, source, medium |
| ecommerce | RECORD | Transaction data (if ecommerce events) |
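Note that the top-level RECORD columns such as device and geo are structs, not arrays: they read with plain dot notation, no UNNEST needed. A quick sketch (placeholder project and table):

```sql
-- Structs like device and geo are accessed with dot notation.
SELECT
  device.category AS device_category,
  geo.country AS country,
  COUNT(*) AS events
FROM `your_project.analytics_123456789.events_20260401`
GROUP BY device_category, country
ORDER BY events DESC
```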

The event_params Challenge

Event parameters are stored as nested arrays, not flat columns. To get the page URL from a page_view event, you need:

SELECT
  event_name,
  (SELECT value.string_value FROM UNNEST(event_params) WHERE key = 'page_location') AS page_url,
  (SELECT value.string_value FROM UNNEST(event_params) WHERE key = 'page_title') AS page_title
FROM
  `your_project.analytics_123456789.events_20260401`
WHERE
  event_name = 'page_view'

This nested structure is the biggest learning curve for people new to GA4 + BigQuery. Every query involves UNNEST to extract parameters.
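One wrinkle worth knowing: each parameter's value lands in exactly one typed field (string_value, int_value, float_value, or double_value). When you're unsure of the type, one pattern is to coalesce across them. The sketch below uses the standard ga_session_id parameter:

```sql
-- A parameter's value sits in one of several typed fields;
-- COALESCE across them when the type isn't known in advance.
SELECT
  event_name,
  (SELECT COALESCE(
      value.string_value,
      CAST(value.int_value AS STRING),
      CAST(value.double_value AS STRING)
    )
    FROM UNNEST(event_params)
    WHERE key = 'ga_session_id') AS session_id
FROM `your_project.analytics_123456789.events_20260401`
LIMIT 10
```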

Step 4: Essential Queries

Total Pageviews by Page

SELECT
  (SELECT value.string_value FROM UNNEST(event_params) WHERE key = 'page_location') AS page_url,
  COUNT(*) AS pageviews
FROM
  `your_project.analytics_123456789.events_*`
WHERE
  event_name = 'page_view'
  AND _TABLE_SUFFIX BETWEEN '20260301' AND '20260331'
GROUP BY page_url
ORDER BY pageviews DESC
LIMIT 50

Purchase Revenue by Source/Medium

SELECT
  traffic_source.source,
  traffic_source.medium,
  COUNT(DISTINCT (SELECT value.string_value FROM UNNEST(event_params) WHERE key = 'transaction_id')) AS transactions,
  SUM(ecommerce.purchase_revenue) AS revenue
FROM
  `your_project.analytics_123456789.events_*`
WHERE
  event_name = 'purchase'
  AND _TABLE_SUFFIX BETWEEN '20260301' AND '20260331'
GROUP BY source, medium
ORDER BY revenue DESC

User Journey (All Events for One User)

SELECT
  event_timestamp,
  event_name,
  (SELECT value.string_value FROM UNNEST(event_params) WHERE key = 'page_location') AS page,
  (SELECT value.string_value FROM UNNEST(event_params) WHERE key = 'source') AS source
FROM
  `your_project.analytics_123456789.events_*`
WHERE
  user_pseudo_id = 'YOUR_USER_ID_HERE'
  AND _TABLE_SUFFIX BETWEEN '20260301' AND '20260331'
ORDER BY event_timestamp

Funnel: Page View → Add to Cart → Purchase

WITH user_events AS (
  SELECT
    user_pseudo_id,
    event_name,
    event_timestamp
  FROM
    `your_project.analytics_123456789.events_*`
  WHERE
    event_name IN ('page_view', 'add_to_cart', 'purchase')
    AND _TABLE_SUFFIX BETWEEN '20260301' AND '20260331'
)
SELECT
  COUNT(DISTINCT CASE WHEN event_name = 'page_view' THEN user_pseudo_id END) AS viewed,
  COUNT(DISTINCT CASE WHEN event_name = 'add_to_cart' THEN user_pseudo_id END) AS added_to_cart,
  COUNT(DISTINCT CASE WHEN event_name = 'purchase' THEN user_pseudo_id END) AS purchased
FROM user_events

Step 5: Cost Management

Monitor Query Costs

Before running a query, BigQuery shows an estimate in the top right: “This query will process X GB.” At $5 per TB queried, a 100 GB query costs $0.50.

Reduce Costs

  1. Always filter by date — Use _TABLE_SUFFIX BETWEEN to limit which daily tables you scan
  2. Select specific columns — Avoid SELECT * on the full events table
  3. Pre-aggregate — build summary tables or materialized views for common queries instead of re-scanning raw events
  4. Set up cost controls — In Google Cloud, set a custom BigQuery quota to cap daily spend

Scheduled Queries

For recurring reports, use scheduled queries instead of running them manually:

  1. BigQuery → Scheduled Queries → Create
  2. Write your query
  3. Set the schedule (daily, weekly)
  4. Choose a destination table

This runs automatically and stores results — cheaper than querying raw data repeatedly.
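As a sketch, the scheduled query for the pageview report above might look like this, with the destination table and append behavior configured in the schedule settings (the names here are illustrative):

```sql
-- Runs daily; the schedule writes results to a destination table
-- (e.g. a reporting dataset) configured in the scheduled-query UI.
SELECT
  PARSE_DATE('%Y%m%d', event_date) AS report_date,
  (SELECT value.string_value FROM UNNEST(event_params)
   WHERE key = 'page_location') AS page_url,
  COUNT(*) AS pageviews
FROM `your_project.analytics_123456789.events_*`
WHERE event_name = 'page_view'
  AND _TABLE_SUFFIX = FORMAT_DATE('%Y%m%d', DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY))
GROUP BY report_date, page_url
```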

Common Issues

Export Not Appearing

| Symptom | Cause | Fix |
|---|---|---|
| No tables after 48 hours | Link isn't active | Check GA4 Admin → BigQuery Links for status |
| Tables exist but empty | Property has no traffic | Verify GA4 is receiving data in the Realtime report |
| Missing recent dates | Daily export lag | Data arrives by ~8 AM in property timezone |
| Missing events | Event not configured in GA4 | Check the Events list in GA4 |

Query Returns No Results

  • Check the date range — _TABLE_SUFFIX uses YYYYMMDD format (string, not date)
  • Check event name casing — GA4 event names are case-sensitive
  • Check for NULL parameters — Some events don’t include all parameters

Storage Growing Fast

If storage costs concern you:

  1. Set up a table expiration policy (delete data older than X days)
  2. Export only to daily tables (skip streaming)
  3. Create summary tables and delete raw data after processing
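For example, a dataset-level default expiration can be set with DDL. Note it applies only to tables created after the statement runs; existing tables keep their settings:

```sql
-- Illustrative: auto-delete export tables ~13 months after creation.
ALTER SCHEMA `your_project.analytics_123456789`
SET OPTIONS (default_table_expiration_days = 400);
```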

Connecting BigQuery to Dashboards

Looker Studio (Free)

  1. In Looker Studio, add a data source
  2. Select BigQuery connector
  3. Choose your project and dataset
  4. Write a custom query or select a table
  5. Build visualizations

Tip: Use a scheduled query to create a summary table, then connect Looker Studio to the summary — much faster than querying raw events.

Tableau / Power BI

Both connect to BigQuery natively. Use the BigQuery ODBC driver for Power BI, or Tableau’s built-in BigQuery connector.

Checklist

  • Google Cloud project created with billing enabled
  • BigQuery API enabled
  • GA4 linked to BigQuery (daily export)
  • Dataset location matches your data residency needs
  • First export arrived (check within 48 hours)
  • Test query runs successfully
  • Cost controls configured (BigQuery quota)
  • Team knows the schema (event_params UNNEST pattern)

The GA4 BigQuery export is the most underused free feature in Google Analytics. Most businesses don’t need it — but for those that do, it transforms GA4 from a reporting tool into a data platform.

Want to make sure GA4 is collecting the right data before you export it? Run a free scan — we verify your event tracking, conversion setup, and data quality.