GA4’s built-in reports are limited. You can’t query individual events, join with external data, or run custom analysis beyond what the Explorations tool offers. The BigQuery export gives you raw, event-level data — every pageview, every click, every purchase — in a SQL-queryable database.
The daily export is free. The streaming (real-time) export costs money. This guide covers how to set up the daily export, what it costs to query, and whether you actually need it.
Why Export to BigQuery
What GA4 Reports Can’t Do
| Need | GA4 UI | BigQuery |
|---|---|---|
| Query individual user journeys | Limited (User Explorer) | Full SQL access |
| Join with CRM data | No | Yes (join on user_id) |
| Custom attribution modeling | No (fixed models) | Yes (build your own) |
| Unsampled data at scale | Sampled above thresholds | Always unsampled |
| Data beyond retention period | Lost after 2/14 months | Retained indefinitely |
| Cross-property analysis | No | Yes (query multiple exports) |
| Feed data to ML models | No | Yes (BigQuery ML) |
| Custom funnel analysis | Limited | Unlimited |
If you’ve hit limits with GA4’s data retention settings, BigQuery is the escape hatch. Your exported data stays in BigQuery forever (or until you delete it), regardless of GA4’s retention window.
Who Actually Needs This
You need BigQuery if:
- You have more than 100K monthly users and hit sampling limits in GA4
- You need to join analytics data with backend data (CRM, inventory, etc.)
- You want custom attribution beyond GA4’s built-in models
- You need data retention beyond 14 months
- You’re building dashboards in Looker Studio or Tableau from raw data
- Your data team wants SQL access to analytics
You don’t need BigQuery if:
- GA4’s standard reports answer your questions
- You have fewer than 10K monthly users (sampling isn’t an issue)
- Nobody on your team knows SQL
- You just want basic traffic and conversion reporting
Step 1: Set Up a Google Cloud Project
If you don’t already have a Google Cloud account:
- Go to console.cloud.google.com
- Click Create Project
- Name it something recognizable: “Analytics Export” or “[Brand] Data”
- Select your billing account (or create one — you need a credit card, but BigQuery has a generous free tier)
Enable the BigQuery API
- In Google Cloud Console, go to APIs & Services → Enable APIs
- Search for “BigQuery API”
- Click Enable
Set Up Billing
BigQuery’s free tier includes:
- 1 TB of queries per month (free)
- 10 GB of storage per month (free)
For most GA4 exports, you’ll stay well within the free tier for queries. Storage will exceed 10 GB after a few months of data for high-traffic sites, but BigQuery storage is cheap ($0.02/GB/month for active storage, $0.01/GB/month for long-term).
Cost estimate by traffic level:
| Monthly Users | Daily Export Size | Monthly Storage | Monthly Query Cost |
|---|---|---|---|
| 10K | ~50 MB/day | ~1.5 GB | Free (well under 1 TB) |
| 100K | ~500 MB/day | ~15 GB | Free to ~$5 |
| 1M | ~5 GB/day | ~150 GB | $5-20 |
| 10M | ~50 GB/day | ~1.5 TB | $20-100+ |
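The arithmetic behind the table above is simple enough to sketch. A minimal Python version, assuming the on-demand prices quoted in this guide ($0.02/GB/month active storage, $5/TB queried, 1 TB/month of queries free) — the function names are hypothetical, not part of any Google library:

```python
ACTIVE_STORAGE_PER_GB = 0.02   # USD per GB per month (active storage)
QUERY_PRICE_PER_TB = 5.0       # USD per TB scanned (on-demand)
FREE_QUERY_TB = 1.0            # free tier: first 1 TB of queries per month

def monthly_storage_cost(daily_export_gb: float, months_retained: int = 12) -> float:
    """Approximate storage bill for one month, given accumulated daily tables."""
    stored_gb = daily_export_gb * 30 * months_retained
    return stored_gb * ACTIVE_STORAGE_PER_GB

def monthly_query_cost(tb_scanned: float) -> float:
    """Query bill after subtracting the monthly free tier."""
    billable_tb = max(0.0, tb_scanned - FREE_QUERY_TB)
    return billable_tb * QUERY_PRICE_PER_TB

# A ~100K-user site (~0.5 GB/day) holding a year of data:
print(round(monthly_storage_cost(0.5), 2))  # 180 GB stored -> 3.6
print(monthly_query_cost(0.5))              # under the free tier -> 0.0
```

Note that long-term storage (tables untouched for 90 days) bills at half the active rate, so real bills trend lower than this sketch.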
Step 2: Link GA4 to BigQuery
- Open GA4 → Admin → Property → BigQuery Links
- Click Link
- Select your Google Cloud project
- Choose your options:
Export Options
| Option | What It Does | Cost |
|---|---|---|
| Daily export | Exports all events from the previous day, once per day | Free (Google pays) |
| Streaming export | Exports events in near-real-time (within minutes) | $$$ (you pay per event) |
| Include advertising identifiers | Includes Google Ads click IDs, campaign data | Free |
Recommendation: Start with Daily export only. Streaming costs $0.05 per 200 MB of streamed data, which adds up fast at high traffic volumes.
Data Location
Choose the BigQuery dataset location closest to your team:
- US (multi-region) — default, cheapest
- EU (multi-region) — required if GDPR mandates EU data residency
- Specific region (e.g., us-east1) — for lowest latency if co-locating with other GCP services
Important: You can’t change the location after export starts. Choose carefully.
Frequency
- Daily — data arrives the next morning (usually by 8 AM in your property’s timezone)
- Continuous — data streams every few minutes (streaming export, costs money)
- Click Submit
The first export will appear within 24-48 hours. It will contain one day of data.
Step 3: Understand the Schema
GA4 exports data in a specific table structure. Each day creates a new table:
`your_project.analytics_PROPERTY_ID.events_YYYYMMDD`
For example: `my-project.analytics_123456789.events_20260401`
There’s also an intraday table (if streaming is enabled): `events_intraday_YYYYMMDD`
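Because the export is date-sharded (one table per day), tooling often needs to enumerate table IDs for a date range. A small illustrative helper — `daily_table_ids` is a hypothetical name, not a Google API:

```python
from datetime import date, timedelta

def daily_table_ids(project: str, property_id: str, start: date, end: date):
    """Yield the sharded daily table IDs GA4 creates, one per day, inclusive."""
    day = start
    while day <= end:
        yield f"{project}.analytics_{property_id}.events_{day.strftime('%Y%m%d')}"
        day += timedelta(days=1)

for table in daily_table_ids("my-project", "123456789",
                             date(2026, 4, 1), date(2026, 4, 3)):
    print(table)
# my-project.analytics_123456789.events_20260401
# my-project.analytics_123456789.events_20260402
# my-project.analytics_123456789.events_20260403
```

In practice you rarely list tables by hand — the `_TABLE_SUFFIX` wildcard shown in the queries below does this filtering inside SQL.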
Key Columns
| Column | Type | Description |
|---|---|---|
| event_name | STRING | The event name (page_view, purchase, etc.) |
| event_timestamp | INTEGER | Event time in Unix microseconds |
| event_date | STRING | Date of the event in YYYYMMDD format |
| user_pseudo_id | STRING | GA4’s anonymous user ID |
| user_id | STRING | Your custom user_id (if set) |
| event_params | RECORD | Array of key-value pairs (page_title, page_location, etc.) |
| user_properties | RECORD | Array of user-scoped properties |
| device | RECORD | Device category, browser, OS, etc. |
| geo | RECORD | Country, city, region |
| traffic_source | RECORD | Campaign, source, medium |
| ecommerce | RECORD | Transaction data (for ecommerce events) |
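One detail that trips people up outside SQL: event_timestamp is epoch time in microseconds, not seconds, so converting it just means dividing by 1,000,000. A minimal sketch (the helper name is hypothetical):

```python
from datetime import datetime, timezone

def event_ts_to_datetime(event_timestamp_us: int) -> datetime:
    """Convert GA4's event_timestamp (Unix microseconds) to a UTC datetime."""
    return datetime.fromtimestamp(event_timestamp_us / 1_000_000, tz=timezone.utc)

print(event_ts_to_datetime(1743465600000000))  # 2025-04-01 00:00:00+00:00
```

In BigQuery itself the equivalent is TIMESTAMP_MICROS(event_timestamp).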
The event_params Challenge
Event parameters are stored as nested arrays, not flat columns. To get the page URL from a page_view event, you need:
```sql
SELECT
  event_name,
  (SELECT value.string_value FROM UNNEST(event_params) WHERE key = 'page_location') AS page_url,
  (SELECT value.string_value FROM UNNEST(event_params) WHERE key = 'page_title') AS page_title
FROM
  `your_project.analytics_123456789.events_20260401`
WHERE
  event_name = 'page_view'
```
This nested structure is the biggest learning curve for people new to GA4 + BigQuery. Every query involves UNNEST to extract parameters.
Step 4: Essential Queries
Total Pageviews by Page
```sql
SELECT
  (SELECT value.string_value FROM UNNEST(event_params) WHERE key = 'page_location') AS page_url,
  COUNT(*) AS pageviews
FROM
  `your_project.analytics_123456789.events_*`
WHERE
  event_name = 'page_view'
  AND _TABLE_SUFFIX BETWEEN '20260301' AND '20260331'
GROUP BY page_url
ORDER BY pageviews DESC
LIMIT 50
```
Note the wildcard applies only to the table suffix — the dataset name (analytics_123456789) must be spelled out; BigQuery doesn’t support wildcards at the dataset level.
Purchase Revenue by Source/Medium
```sql
SELECT
  traffic_source.source AS source,
  traffic_source.medium AS medium,
  COUNT(DISTINCT (SELECT value.string_value FROM UNNEST(event_params) WHERE key = 'transaction_id')) AS transactions,
  SUM(ecommerce.purchase_revenue) AS revenue
FROM
  `your_project.analytics_123456789.events_*`
WHERE
  event_name = 'purchase'
  AND _TABLE_SUFFIX BETWEEN '20260301' AND '20260331'
GROUP BY source, medium
ORDER BY revenue DESC
```
User Journey (All Events for One User)
```sql
SELECT
  event_timestamp,
  event_name,
  (SELECT value.string_value FROM UNNEST(event_params) WHERE key = 'page_location') AS page,
  (SELECT value.string_value FROM UNNEST(event_params) WHERE key = 'source') AS source
FROM
  `your_project.analytics_123456789.events_*`
WHERE
  user_pseudo_id = 'YOUR_USER_ID_HERE'
  AND _TABLE_SUFFIX BETWEEN '20260301' AND '20260331'
ORDER BY event_timestamp
```
Funnel: Page View → Add to Cart → Purchase
```sql
WITH user_events AS (
  SELECT
    user_pseudo_id,
    event_name,
    event_timestamp
  FROM
    `your_project.analytics_123456789.events_*`
  WHERE
    event_name IN ('page_view', 'add_to_cart', 'purchase')
    AND _TABLE_SUFFIX BETWEEN '20260301' AND '20260331'
)
SELECT
  COUNT(DISTINCT CASE WHEN event_name = 'page_view' THEN user_pseudo_id END) AS viewed,
  COUNT(DISTINCT CASE WHEN event_name = 'add_to_cart' THEN user_pseudo_id END) AS added_to_cart,
  COUNT(DISTINCT CASE WHEN event_name = 'purchase' THEN user_pseudo_id END) AS purchased
FROM user_events
```
Step 5: Cost Management
Monitor Query Costs
Before running a query, BigQuery shows an estimate in the top right: “This query will process X GB.” At $5 per TB queried, a 100 GB query costs $0.50.
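The price math is worth internalizing: at on-demand rates you pay per byte scanned, not per row returned. A one-line sketch using this guide’s $5/TB figure and the decimal GB-to-TB conversion the estimate above implies (the function name is hypothetical):

```python
def query_cost_usd(gb_scanned: float, price_per_tb: float = 5.0) -> float:
    """On-demand query cost: dollars per TB scanned, using 1 TB = 1000 GB."""
    return gb_scanned / 1000 * price_per_tb

print(round(query_cost_usd(100), 2))  # the 100 GB example above -> 0.5
```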
Reduce Costs
- Always filter by date — use `_TABLE_SUFFIX BETWEEN` to limit which daily tables you scan
- Select specific columns — avoid `SELECT *` on the full events table
- Precompute common queries — create materialized views or summary tables instead of re-scanning raw events
- Set up cost controls — In Google Cloud, set a custom BigQuery quota to cap daily spend
Scheduled Queries
For recurring reports, use scheduled queries instead of running them manually:
- BigQuery → Scheduled Queries → Create
- Write your query
- Set the schedule (daily, weekly)
- Choose a destination table
This runs automatically and stores results — cheaper than querying raw data repeatedly.
Common Issues
Export Not Appearing
| Symptom | Cause | Fix |
|---|---|---|
| No tables after 48 hours | Link isn’t active | Check GA4 Admin → BigQuery Links for status |
| Tables exist but empty | Property has no traffic | Verify GA4 is receiving data in Realtime report |
| Missing recent dates | Daily export lag | Data arrives by ~8 AM in property timezone |
| Missing events | Event not configured in GA4 | Check Events list in GA4 |
Query Returns No Results
- Check the date range — `_TABLE_SUFFIX` uses YYYYMMDD format (string, not date)
- Check event name casing — GA4 event names are case-sensitive
- Check for NULL parameters — Some events don’t include all parameters
Storage Growing Fast
If storage costs concern you:
- Set up a table expiration policy (delete data older than X days)
- Export only to daily tables (skip streaming)
- Create summary tables and delete raw data after processing
Connecting BigQuery to Dashboards
Looker Studio (Free)
- In Looker Studio, add a data source
- Select BigQuery connector
- Choose your project and dataset
- Write a custom query or select a table
- Build visualizations
Tip: Use a scheduled query to create a summary table, then connect Looker Studio to the summary — much faster than querying raw events.
Tableau / Power BI
Both connect to BigQuery natively: Power BI and Tableau each include a built-in Google BigQuery connector, with ODBC drivers available as a fallback.
Checklist
- Google Cloud project created with billing enabled
- BigQuery API enabled
- GA4 linked to BigQuery (daily export)
- Dataset location matches your data residency needs
- First export arrived (check within 48 hours)
- Test query runs successfully
- Cost controls configured (BigQuery quota)
- Team knows the schema (event_params UNNEST pattern)
The GA4 BigQuery export is the most underused free feature in Google Analytics. Most businesses don’t need it — but for those that do, it transforms GA4 from a reporting tool into a data platform.
Want to make sure GA4 is collecting the right data before you export it? Run a free scan — we verify your event tracking, conversion setup, and data quality.