Symptom Library
Common Adobe Analytics Issues
Data Collection Problems
| Symptom | Error Message | Severity | Business Impact |
|---|---|---|---|
| Missing Page Views | "Page not being tracked" | High | Incomplete funnel analysis |
| Server Call Discrepancies | None - volume mismatch | Medium | Billing concerns |
| Props/eVars Not Populating | "undefined" in reports | High | Lost attribution data |
| Events Not Firing | No error - events = 0 | High | Conversion tracking broken |
| Timestamp Mismatch | Future/past timestamps | Medium | Reporting inaccuracy |
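When page views go missing entirely, it helps to first confirm that the collection endpoint itself is reachable. A minimal sketch, assuming the requests library; the tracking server, report suite ID, and parameters below are placeholders, the exact beacon path segments vary by AppMeasurement version, and it should only target a dev report suite because a successful request records a real hit.

# Fire a bare-bones test beacon at the collection endpoint.
# TRACKING_SERVER and DEV_RSID are placeholders -- use a dev report
# suite, since a successful request records a real hit.
import uuid
import requests

TRACKING_SERVER = 'metrics.example.com'  # your s.trackingServer value
DEV_RSID = 'dev-rsid'                    # a development report suite

def send_test_hit(page_name='debug-test-page'):
    """Send a minimal image beacon and return the HTTP status code."""
    # Adobe beacons bracket their query string with AQB=1 ... AQE=1
    params = {
        'AQB': '1',
        'ndh': '1',                # no document history
        'pageName': page_name,
        'vid': uuid.uuid4().hex,   # throwaway visitor ID
        'AQE': '1',
    }
    url = f'https://{TRACKING_SERVER}/b/ss/{DEV_RSID}/0'
    resp = requests.get(url, params=params, timeout=10)
    # A healthy endpoint returns 200 with a transparent GIF
    print(resp.status_code, resp.headers.get('Content-Type'))
    return resp.status_code

send_test_hit()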
Identity & Visitor Tracking
| Symptom | Indicator | Impact |
|---|---|---|
| Visitor ID Not Persisting | New visitor every session | Inflated visitor counts |
| Cross-Domain Tracking Broken | Domain changes reset visitor | Fragmented user journeys |
| Mobile App ID Issues | App reinstalls = new visitors | Poor retention metrics |
| ECID Not Set | Missing Experience Cloud ID | Integration failures |
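A quick way to confirm whether the ECID is set is to inspect the AMCV_* cookie, where visitor.js stores the Experience Cloud ID in an MCMID|&lt;id&gt; segment. A minimal sketch for pulling that segment out of a raw cookie value copied from DevTools or a HAR file; the sample value below is fabricated.

# Extract the Experience Cloud ID (ECID) from an AMCV_* cookie value.
# The MCMID|<id> segment is where visitor.js stores the ECID.
import re
from urllib.parse import unquote

def extract_ecid(cookie_value):
    """Return the ECID embedded in an AMCV cookie value, if present."""
    decoded = unquote(cookie_value)
    match = re.search(r'MCMID\|(\d+)', decoded)
    return match.group(1) if match else None

# Fabricated sample value for illustration
sample = '1234567890%7CMCIDTS%7C19700%7CMCMID%7C79854754683744242073062302194964323682'
print(extract_ecid(sample))  # -> 79854754683744242073062302194964323682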
Reporting Discrepancies
- Real-time reports vs. standard reports mismatch
- Data latency exceeding 2 hours
- Missing data in specific date ranges
- Report suite data not aggregating correctly
- Calculated metrics returning unexpected values
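For missing data in specific date ranges, a daily breakdown from the 2.0 Reports API makes gaps easy to spot. A minimal sketch, reusing the authentication approach from the API Explorer script later in this section; the zero-page-view test and placeholder arguments are illustrative.

# Flag days with zero page views in a date range via the 2.0 Reports API.
# Assumes a valid OAuth access token (see the API Explorer script below).
import requests

def find_gap_days(access_token, client_id, company_id, rsid, start, end):
    url = f'https://analytics.adobe.io/api/{company_id}/reports'
    headers = {
        'Authorization': f'Bearer {access_token}',
        'x-api-key': client_id,
        'x-proxy-global-company-id': company_id
    }
    body = {
        'rsid': rsid,
        'globalFilters': [{'type': 'dateRange', 'dateRange': f'{start}/{end}'}],
        'metricContainer': {'metrics': [{'id': 'metrics/pageviews'}]},
        'dimension': 'variables/daterangeday'
    }
    resp = requests.post(url, headers=headers, json=body, timeout=30)
    resp.raise_for_status()
    rows = resp.json().get('rows', [])
    # Each row carries the day label in 'value' and metric values in 'data'
    return [r['value'] for r in rows if r['data'][0] == 0]

# Example with placeholder credentials:
# find_gap_days(token, 'CLIENT_ID', 'companyid', 'prod-rsid',
#               '2025-12-01T00:00:00', '2025-12-26T00:00:00')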
Investigation Workflow
Triage Process
Step 1: Verify Tracking Implementation
// Enable Adobe Analytics debugging in the browser console
_satellite.setDebug(true); // If using Launch/Tags
// Or use the Adobe Experience Cloud Debugger extension
// Check for:
// 1. Analytics beacon firing
// 2. Correct report suite ID
// 3. Props/eVars populated
// 4. Events triggering

// Verify tracking code version
if (typeof s !== 'undefined') {
  console.log('AppMeasurement version:', s.version);
  console.log('Report Suite:', s.account);
  console.log('Tracking Server:', s.trackingServer);
}
Step 2: Inspect Network Traffic
- Open browser DevTools > Network tab
- Filter for "b/ss" (Adobe Analytics beacons)
- Check beacon parameters (e.g., pageName, events, props c1-c75, eVars v1-v75); the HAR parser sketch below automates this
- Verify a 200 OK response on each beacon
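Exporting the session as a HAR file and parsing it programmatically scales better than eyeballing each request. A minimal sketch, assuming a HAR exported from the DevTools Network tab; the parameter list checked is an example, not exhaustive.

# Pull Adobe Analytics beacons ('/b/ss/' requests) out of a HAR export
# and print the parameters most often involved in tracking bugs.
import json
from urllib.parse import urlparse, parse_qs

def audit_beacons(har_path, keys=('pageName', 'events', 'v1', 'c1')):
    with open(har_path, encoding='utf-8') as f:
        har = json.load(f)
    for entry in har['log']['entries']:
        url = entry['request']['url']
        if '/b/ss/' not in url:
            continue
        params = parse_qs(urlparse(url).query)
        # POST beacons carry parameters in the body, not the query string
        post = entry['request'].get('postData', {}).get('text', '')
        params.update(parse_qs(post))
        status = entry['response']['status']
        print(f'[{status}] {urlparse(url).path}')
        for k in keys:
            print(f'  {k} = {params.get(k, ["<missing>"])[0]}')

audit_beacons('network_trace.har')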
Step 3: Use Adobe Analytics Debugger
# Install Adobe Experience Cloud Debugger
# Chrome: https://chrome.google.com/webstore/detail/adobe-experience-cloud-de/...
# Firefox: https://addons.mozilla.org/en-US/firefox/addon/adobe-experience-cloud-...
# Check in debugger:
# - Report Suite ID matches expected
# - All variables populated correctly
# - No JavaScript errors blocking tracking
# - Processing rules applying as expected
Step 4: Query Data Warehouse
-- Verify data exists in raw logs
SELECT
pagename,
COUNT(*) as hits,
COUNT(DISTINCT post_visid_high || post_visid_low) as visitors
FROM adobe_analytics_raw.hit_data
WHERE date_time >= CURRENT_DATE - 7
AND report_suite_id = 'prod-rsid'
GROUP BY pagename
ORDER BY hits DESC
LIMIT 20;
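Once the query returns, comparing current hit counts against a prior-week baseline catches partial losses that a simple existence check would miss. A minimal sketch; the 50% threshold and sample rows are illustrative.

# Compare per-page hit counts against last week's baseline and flag
# pages that fall below a threshold share of it.
def flag_drops(current, baseline, threshold=0.5):
    """current/baseline map pagename -> hits; return suspicious pages."""
    flagged = []
    for page, base_hits in baseline.items():
        now = current.get(page, 0)
        if base_hits > 0 and now < base_hits * threshold:
            flagged.append((page, now, base_hits))
    return flagged

# Sample values for illustration
current = {'home': 9500, 'checkout': 40, 'product': 8800}
baseline = {'home': 10000, 'checkout': 1200, 'product': 9000}
for page, now, base in flag_drops(current, baseline):
    print(f'{page}: {now} hits vs. baseline {base}')
# -> checkout: 40 hits vs. baseline 1200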
Decision Tree
Issue Reported
│
├─ Data Missing Entirely?
│ ├─ YES → Check tracking code deployment
│ │ → Verify report suite ID
│ │ → Check ad blockers/privacy extensions
│ │
│ └─ NO → Data Delayed?
│ ├─ YES → Check processing queue status
│ │ → Review data latency SLA
│ │
│ └─ NO → Data Inaccurate?
│ → Verify processing rules
│ → Check VISTA rules
│ → Review bot filtering settings
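The same branches can be codified so on-call engineers get consistent next steps. A minimal sketch mirroring the tree above; extend it as new failure modes are documented.

# Triage helper mirroring the decision tree: answer two yes/no
# questions and get the corresponding checklist back.
def triage(data_missing, data_delayed=False):
    if data_missing:
        return ['Check tracking code deployment',
                'Verify report suite ID',
                'Check ad blockers/privacy extensions']
    if data_delayed:
        return ['Check processing queue status',
                'Review data latency SLA']
    # Data present and on time, but inaccurate
    return ['Verify processing rules',
            'Check VISTA rules',
            'Review bot filtering settings']

for step in triage(data_missing=False, data_delayed=True):
    print(f'- {step}')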
Tools & Logs
Adobe Analytics Debugging Tools
1. Adobe Experience Cloud Debugger
- Real-time beacon inspection
- Variable population verification
- Launch/Tags rule debugging
- ECID validation
2. ObservePoint (Third-Party)
- Automated tag auditing
- Multi-page journey validation
- Historical tracking verification
- Compliance monitoring
3. Charles Proxy / Fiddler
- HTTPS proxying for beacon inspection outside the browser
- Essential for debugging mobile app (SDK) traffic
- Request rewriting to test beacon variations
4. Adobe Analytics API Explorer
# Python script to query the Adobe Analytics 2.0 API for debugging
import requests
from datetime import datetime, timedelta

class AdobeAnalyticsDebugger:
    def __init__(self, client_id, client_secret, company_id):
        self.client_id = client_id
        self.client_secret = client_secret
        self.company_id = company_id
        self.access_token = self.get_access_token()

    def get_access_token(self):
        # OAuth client-credentials flow; the endpoint and scopes depend on
        # how your Adobe developer project credential is configured
        auth_url = 'https://ims-na1.adobelogin.com/ims/token/v1'
        payload = {
            'grant_type': 'client_credentials',
            'client_id': self.client_id,
            'client_secret': self.client_secret,
            'scope': 'openid,AdobeID,read_organizations,additional_info.projectedProductContext'
        }
        response = requests.post(auth_url, data=payload)
        response.raise_for_status()
        return response.json()['access_token']

    def check_recent_data(self, rsid, hours=24):
        """Check if data is flowing in the last N hours"""
        url = f'https://analytics.adobe.io/api/{self.company_id}/reports'
        headers = {
            'Authorization': f'Bearer {self.access_token}',
            'x-api-key': self.client_id,
            'x-proxy-global-company-id': self.company_id
        }
        # Build a request for the last N hours, broken down by hour
        end_date = datetime.now()
        start_date = end_date - timedelta(hours=hours)
        request_body = {
            'rsid': rsid,
            'globalFilters': [{
                'type': 'dateRange',
                'dateRange': f"{start_date.strftime('%Y-%m-%dT%H:%M:%S')}/{end_date.strftime('%Y-%m-%dT%H:%M:%S')}"
            }],
            'metricContainer': {
                'metrics': [
                    {'id': 'metrics/pageviews'},
                    {'id': 'metrics/visits'}
                ]
            },
            'dimension': 'variables/daterangehour'
        }
        response = requests.post(url, headers=headers, json=request_body)
        response.raise_for_status()
        data = response.json()
        # Summarize totals; the hourly rows in data['rows'] reveal gaps
        print(f"Data check for last {hours} hours:")
        print(f"Total page views: {data['summaryData']['totals'][0]}")
        print(f"Total visits: {data['summaryData']['totals'][1]}")
        return data

# Usage
debugger = AdobeAnalyticsDebugger(
    client_id='YOUR_CLIENT_ID',
    client_secret='YOUR_CLIENT_SECRET',
    company_id='YOUR_COMPANY_ID'
)
result = debugger.check_recent_data('your-rsid', hours=24)
Log Access & Retention
Admin Console Logs
- Location: Analytics > Admin > Logs > Usage & Access
- Retention: 2 years
- Contains: Login activity, configuration changes, API calls
Data Feed Logs
- Location: Analytics > Admin > Data Feeds
- Retention: 90 days
- Contains: Feed processing status, error messages
Processing Rules Log
- Location: Analytics > Admin > Report Suites > Edit Settings > General > Processing Rules
- Retention: 30 days
- Contains: Rule execution history
Export Instructions
# Export audit logs via API
curl -X GET \
'https://analytics.adobe.io/api/{COMPANY_ID}/auditlogs/usage?limit=100&page=0' \
-H 'Authorization: Bearer {ACCESS_TOKEN}' \
-H 'x-api-key: {CLIENT_ID}' \
-H 'x-proxy-global-company-id: {COMPANY_ID}' \
> audit_logs.json
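The curl call above fetches a single page; for incident forensics you usually want every page. A Python sketch of the same endpoint with pagination; the response envelope handling is an assumption and may need adjusting to what the API actually returns.

# Paginate through the audit log endpoint used by the curl call above.
# The unwrapping and stop condition are assumptions about the response
# envelope -- adjust to what the endpoint actually returns.
import json
import requests

def export_audit_logs(access_token, client_id, company_id, out_path):
    url = f'https://analytics.adobe.io/api/{company_id}/auditlogs/usage'
    headers = {
        'Authorization': f'Bearer {access_token}',
        'x-api-key': client_id,
        'x-proxy-global-company-id': company_id
    }
    all_entries, page = [], 0
    while True:
        resp = requests.get(url, headers=headers,
                            params={'limit': 100, 'page': page}, timeout=30)
        resp.raise_for_status()
        payload = resp.json()
        # Unwrap if the endpoint nests results (e.g., under 'content')
        entries = payload.get('content') if isinstance(payload, dict) else payload
        if not entries:
            break
        all_entries.extend(entries)
        page += 1
    with open(out_path, 'w', encoding='utf-8') as f:
        json.dump(all_entries, f, indent=2)
    return len(all_entries)

# export_audit_logs(token, 'CLIENT_ID', 'companyid', 'audit_logs.json')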
Escalation & Communication
When to Escalate
Escalate to Adobe Support When:
- Data processing delays exceed 4 hours
- Widespread data loss (>10% of expected volume)
- API outages affecting critical integrations
- Report suite corruption or configuration errors
- Billing discrepancies for server calls
Escalate to Client/Stakeholders When:
- Data loss impacts published reports
- Tracking implementation changes required
- SLA violations occur
- Regulatory compliance issues identified
Status Update Template
**Incident:** Missing conversion data for Product X
**Status:** Investigating
**Impact:**
- Affected Date Range: 2025-12-25 10:00 AM - 12:00 PM PST
- Affected Metrics: Purchase events, Revenue
- Affected Reports: Conversion funnel, Product performance
- Estimated Data Loss: ~2,000 transactions ($50,000 revenue)
**Root Cause:** Processing rule misconfiguration deployed at 9:45 AM PST
**Resolution Plan:**
1. Revert processing rule to previous version (ETA: 30 mins)
2. Request data reprocessing for affected timeframe (ETA: 24-48 hours)
3. Validate data accuracy post-reprocessing
**Next Update:** 2:00 PM PST or when resolution complete
**Contact:** analytics-team@company.com | Slack: #analytics-incidents
Adobe Support Case Template
Subject: [PRIORITY] Missing Event Data - Report Suite: prod-rsid
Report Suite ID: prod-rsid
Issue Category: Data Collection > Events
Priority: P1 (Business Critical)
Description:
Purchase events (event1) stopped recording at 10:00 AM PST on 2025-12-26.
- Last successful event: 9:58 AM PST
- Expected volume: ~100 events/hour
- Actual volume: 0 events since 10:00 AM
- Verified tracking code firing correctly (see attached debugger screenshots)
- No recent configuration changes
Impact:
- Revenue reporting incomplete
- Executive dashboard unavailable
- Marketing campaign ROI cannot be calculated
Troubleshooting Completed:
1. Verified tracking beacon in browser debugger ✓
2. Checked processing rules - no recent changes ✓
3. Reviewed VISTA rules - no issues found ✓
4. Tested in multiple browsers - same issue ✓
5. API queries return 0 events for timeframe ✓
Attachments:
- debugger_screenshot.png
- network_trace.har
- sample_beacon_request.txt
Requested Action:
- Investigate data processing pipeline for event1
- Confirm if data can be recovered/reprocessed
- Provide ETA for resolution
Preventive Maintenance
Recurring Review Schedule
Weekly
- Review data quality dashboards for anomalies
- Check processing queue for delays
- Monitor server call volume vs. contract limit (see the projection sketch after this list)
- Scan for JavaScript errors in tag management
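For the server call check, a straight-line projection from month-to-date volume catches contract overages weeks before month end. A minimal sketch; the volumes and contract limit are made-up placeholders.

# Project end-of-month server call volume from month-to-date usage and
# warn when the projection crosses the contracted limit.
import calendar
from datetime import date

def project_monthly_calls(mtd_calls, today, limit):
    days_in_month = calendar.monthrange(today.year, today.month)[1]
    projected = mtd_calls / today.day * days_in_month
    pct = projected / limit * 100
    status = 'OVER' if projected > limit else 'ok'
    print(f'Projected: {projected:,.0f} / {limit:,} ({pct:.0f}%) [{status}]')
    return projected

# Made-up numbers for illustration
project_monthly_calls(mtd_calls=42_000_000, today=date(2025, 12, 26),
                      limit=50_000_000)
# -> Projected: 50,076,923 / 50,000,000 (100%) [OVER]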
Monthly
- Audit active processing rules for conflicts
- Review utilization of custom traffic variables (props)
- Check for deprecated tracking methods
- Validate cross-domain tracking configuration
Quarterly
- Full implementation audit (QA all tracking)
- Review and update VISTA rules with Adobe
- Bot detection rules assessment
- Data governance policy review
Automated Monitoring
# Automated data quality monitoring script. The get_* helpers are
# placeholders to be implemented against the Reports API (see the
# AdobeAnalyticsDebugger class above).
import requests
import schedule
import time
from datetime import datetime

def check_data_quality():
    """Run automated checks on Adobe Analytics data"""
    checks = [
        check_page_view_volume(),
        check_conversion_events(),
        # Additional checks (processing latency, variable population)
        # can be appended here as they are implemented
    ]
    issues = [c for c in checks if not c['passed']]
    if issues:
        send_alert(issues)

def check_page_view_volume():
    """Alert if page views drop below threshold"""
    # Query API for the last hour's page views
    recent_pv = get_recent_pageviews(hours=1)
    baseline = get_baseline_pageviews()
    threshold = baseline * 0.7  # 30% drop triggers alert
    return {
        'check': 'Page View Volume',
        'passed': recent_pv >= threshold,
        'value': recent_pv,
        'threshold': threshold
    }

def check_conversion_events():
    """Alert if no conversions in the last hour during business hours"""
    hour = datetime.now().hour
    if 9 <= hour <= 17:  # Business hours
        recent_conversions = get_recent_conversions(hours=1)
        return {
            'check': 'Conversion Events',
            'passed': recent_conversions > 0,
            'value': recent_conversions,
            'threshold': 1
        }
    return {'check': 'Conversion Events', 'passed': True}

def send_alert(issues):
    """Send alerts via Slack/Email"""
    message = "Adobe Analytics Data Quality Alert:\n\n"
    for issue in issues:
        message += f"❌ {issue['check']}: {issue['value']} (threshold: {issue['threshold']})\n"
    # Send to Slack
    requests.post('https://hooks.slack.com/services/YOUR/WEBHOOK/URL', json={'text': message})

# Placeholder data-fetch helpers -- implement against the Reports API
def get_recent_pageviews(hours=1):
    raise NotImplementedError

def get_baseline_pageviews():
    raise NotImplementedError

def get_recent_conversions(hours=1):
    raise NotImplementedError

# Schedule checks every 15 minutes
schedule.every(15).minutes.do(check_data_quality)
while True:
    schedule.run_pending()
    time.sleep(60)
Backlog Items for Risk Reduction
High Priority
- Implement automated regression testing for tracking
- Set up redundant tracking (backup analytics tool)
- Create data recovery runbook
- Build custom alerting dashboard
Medium Priority
- Migrate to server-side tracking for critical events
- Implement A/B testing for tracking changes
- Create synthetic monitoring for key user flows
- Document all custom implementations
Low Priority
- Archive old report suites
- Clean up unused calculated metrics
- Consolidate duplicate processing rules
- Optimize variable usage across report suites