Data Export
Export your event data to CSV for external analysis, reporting, and integration with other tools.
Overview
The data export feature allows you to download your raw event data in CSV format. Use this for custom analysis, importing into spreadsheets, or feeding into other analytics platforms.
Exporting Data
Via UI
- Navigate to Events in the sidebar
- Click Export CSV
- Select date range (optional)
- Click Download
- CSV file will download to your device
Via API
GET /api/events/export?projectId={id}&startDate={start}&endDate={end}
Example:
GET /api/events/export?projectId=proj_xyz&startDate=2025-12-01T00:00:00Z&endDate=2025-12-05T23:59:59Z
Response:
CSV file download with event data
Using cURL
curl -X GET \
"https://api.serla.dev/api/events/export?projectId=proj_xyz&startDate=2025-12-01T00:00:00Z&endDate=2025-12-05T23:59:59Z" \
-H "Authorization: Bearer YOUR_API_KEY" \
-o events.csv
CSV Format
Columns
The exported CSV includes the following columns (a sketch for loading them with explicit types follows the list):
- id: Unique event identifier
- eventName: Name of the event
- userId: User identifier (if provided)
- sessionId: Session identifier (if provided)
- metadata: JSON string of event metadata
- revenueAmount: Revenue amount (if applicable)
- revenueCurrency: Currency code (if applicable)
- language: Browser language (e.g., "en-US")
- userAgent: Browser user agent string
- ipAddress: User IP address
- timestamp: When the event occurred
- createdAt: When the event was recorded
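The column names above appear verbatim as CSV headers. As a minimal sketch for loading them with explicit types in pandas (the dtype mapping here is an assumption, not part of the export spec):
import pandas as pd
# Assumed dtype mapping for the documented columns; adjust as needed.
df = pd.read_csv(
    "events.csv",
    dtype={
        "id": "string",
        "eventName": "string",
        "userId": "string",
        "sessionId": "string",
        "metadata": "string",
        "revenueAmount": "float64",   # empty for non-revenue events
        "revenueCurrency": "string",
        "language": "string",
        "userAgent": "string",
        "ipAddress": "string",
    },
    parse_dates=["timestamp", "createdAt"],
)
print(df.dtypes)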
Example
id,eventName,userId,sessionId,metadata,revenueAmount,revenueCurrency,language,userAgent,ipAddress,timestamp,createdAt
evt_abc123,page_view,user_123,sess_xyz,"{""page"":""/pricing""}",,,en-US,"Mozilla/5.0...",192.168.1.1,2025-12-05T10:30:00Z,2025-12-05T10:30:01Z
evt_abc124,button_click,user_123,sess_xyz,"{""button"":""signup""}",,,en-US,"Mozilla/5.0...",192.168.1.1,2025-12-05T10:31:00Z,2025-12-05T10:31:01Z
evt_abc125,purchase,user_123,sess_xyz,"{""plan"":""pro""}",99.00,USD,en-US,"Mozilla/5.0...",192.168.1.1,2025-12-05T10:32:00Z,2025-12-05T10:32:01Z
Use Cases
Custom Analysis
Import data into Excel, Google Sheets, or R/Python for custom analysis that goes beyond built-in dashboards.
Data Warehousing
Load event data into your data warehouse (Snowflake, BigQuery, Redshift) for centralized analytics.
Machine Learning
Export historical data to train predictive models for user behavior, churn prediction, or recommendation systems.
Compliance & Auditing
Download complete event logs for compliance audits, data retention policies, or regulatory requirements.
Backup & Archive
Create periodic backups of your event data for disaster recovery or long-term archival.
Third-Party Tools
Import data into business intelligence tools like Tableau, Looker, or Power BI for custom visualizations.
Date Range Selection
All Data
Omit startDate and endDate parameters to export all events:
GET /api/events/export?projectId=proj_xyz
Specific Date Range
Provide ISO 8601 timestamps for start and end dates:
GET /api/events/export?projectId=proj_xyz&startDate=2025-12-01T00:00:00Z&endDate=2025-12-31T23:59:59Z
Last N Days
Calculate start date programmatically:
# Python example
from datetime import datetime, timedelta
end_date = datetime.utcnow()
start_date = end_date - timedelta(days=30)
url = f"https://api.serla.dev/api/events/export?projectId=proj_xyz&startDate={start_date.isoformat()}Z&endDate={end_date.isoformat()}Z"File Size Considerations
Large Exports
For projects with millions of events:
- Exports may take several minutes to generate
- Files can be hundreds of MB or larger
- Consider breaking exports into smaller date ranges
- Use streaming or batch processing when importing (sketched below)
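For example, a large export can be streamed to disk without buffering it in memory, then read in batches. A minimal sketch; the 1 MB download chunk and 100,000-row batch size are arbitrary assumptions, and process() stands in for your own handler:
import requests
import pandas as pd
# Stream the download to disk instead of holding the whole file in memory.
with requests.get(
    "https://api.serla.dev/api/events/export",
    params={"projectId": "proj_xyz"},
    headers={"Authorization": "Bearer YOUR_API_KEY"},
    stream=True,
) as response:
    response.raise_for_status()
    with open("events.csv", "wb") as f:
        for chunk in response.iter_content(chunk_size=1024 * 1024):
            f.write(chunk)
# Read the file in batches rather than loading it all at once.
for batch in pd.read_csv("events.csv", chunksize=100_000):
    process(batch)  # hypothetical downstream handler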
Chunked Exports
For very large datasets, export in monthly or weekly chunks:
# Export each month of 2025 separately; download_export is a hypothetical
# helper that wraps the GET /api/events/export call.
import calendar
for month in range(1, 13):
    last_day = calendar.monthrange(2025, month)[1]  # handles 28/30/31-day months
    start = f"2025-{month:02d}-01T00:00:00Z"
    end = f"2025-{month:02d}-{last_day}T23:59:59Z"
    download_export(
        projectId="proj_xyz",
        startDate=start,
        endDate=end,
        filename=f"events_2025_{month:02d}.csv",
    )
Working with Metadata
Parsing JSON Metadata
The metadata column contains JSON strings. Parse them to access individual fields:
Python Example
import pandas as pd
import json
# Read CSV
df = pd.read_csv('events.csv')
# Parse metadata JSON
df['metadata_parsed'] = df['metadata'].apply(
    lambda x: json.loads(x) if pd.notna(x) else {}
)
# Extract specific fields
df['page'] = df['metadata_parsed'].apply(lambda x: x.get('page'))
df['utm_source'] = df['metadata_parsed'].apply(lambda x: x.get('utm_source'))
# Analyze
print(df.groupby('utm_source')['eventName'].count())
Excel Example
Use Power Query to parse JSON in Excel:
- Go to Data → From Text/CSV, select the exported file, and click Transform Data to open the Power Query editor
- Select the metadata column
- On the Transform tab, choose Parse → JSON
- Click the expand icon in the column header to expand the JSON fields into new columns
- Click Close & Load to return the result to the worksheet
Common Metadata Fields
- page: Current page path
- referrer: Referring URL
- utm_source: Traffic source
- utm_medium: Traffic medium
- utm_campaign: Campaign name
- button: Button identifier
- form: Form identifier
- plan: Subscription plan
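Rather than extracting these fields one by one, you can fan every metadata key out into its own column in one pass. A sketch using pandas.json_normalize, assuming the events.csv export from above:
import json
import pandas as pd
df = pd.read_csv("events.csv")
# Parse each metadata string, then expand the keys into metadata.* columns.
parsed = df["metadata"].apply(lambda x: json.loads(x) if pd.notna(x) else {})
meta_cols = pd.json_normalize(parsed.tolist()).add_prefix("metadata.")
df = df.join(meta_cols)
print(df.filter(like="metadata.").columns.tolist())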
Automation
Scheduled Exports
Automate daily/weekly exports with cron jobs or scheduled tasks:
Bash Script Example
#!/bin/bash
# Daily export at 2am
# Add to crontab: 0 2 * * * /path/to/export.sh
PROJECT_ID="proj_xyz"
API_KEY="your_api_key"
# Export the previous (complete) day; "-d yesterday" is GNU date
DATE=$(date -u -d "yesterday" +%Y-%m-%d)
curl -X GET \
"https://api.serla.dev/api/events/export?projectId=$PROJECT_ID&startDate=${DATE}T00:00:00Z&endDate=${DATE}T23:59:59Z" \
-H "Authorization: Bearer $API_KEY" \
-o "exports/events_$DATE.csv"
echo "Export completed for $DATE"Python Script Example
import requests
from datetime import datetime, timedelta
import os
def export_events(project_id, api_key, start_date, end_date):
    url = "https://api.serla.dev/api/events/export"
    params = {
        "projectId": project_id,
        "startDate": start_date.isoformat() + "Z",
        "endDate": end_date.isoformat() + "Z",
    }
    headers = {"Authorization": f"Bearer {api_key}"}
    response = requests.get(url, params=params, headers=headers)
    if response.status_code == 200:
        filename = f"events_{start_date.date()}.csv"
        with open(filename, "wb") as f:
            f.write(response.content)
        print(f"Exported {filename}")
    else:
        print(f"Export failed: {response.status_code}")

# Export yesterday's data
yesterday = datetime.utcnow() - timedelta(days=1)
start = yesterday.replace(hour=0, minute=0, second=0, microsecond=0)
end = yesterday.replace(hour=23, minute=59, second=59, microsecond=999999)
export_events(
    project_id=os.getenv("SERLA_PROJECT_ID"),
    api_key=os.getenv("SERLA_API_KEY"),
    start_date=start,
    end_date=end,
)
Best Practices
Export Strategy
- Export regularly (daily or weekly) for incremental backups
- Use consistent date ranges for time-series analysis
- Validate exports by checking row counts and date ranges (see the sketch after this list)
- Store exports in versioned/dated folders
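A minimal validation sketch; the file path matches the bash script above, and the expected range is whatever you requested:
import pandas as pd
df = pd.read_csv("exports/events_2025-12-05.csv", parse_dates=["timestamp"])
# Row count should roughly match the event count shown in the dashboard.
print(f"rows: {len(df)}")
# Every timestamp should fall inside the requested date range.
start = pd.Timestamp("2025-12-05T00:00:00Z")
end = pd.Timestamp("2025-12-05T23:59:59Z")
assert df["timestamp"].between(start, end).all(), "events outside requested range"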
Performance
- Limit exports to necessary date ranges
- Schedule large exports during off-peak hours
- Use compression (gzip) for storage and transfer, as shown after this list
- Consider pagination for very large datasets
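For instance, compressing an export after download typically shrinks it several-fold. A minimal sketch using only the standard library:
import gzip
import shutil
# Compress the raw CSV; the original can then be deleted or archived.
with open("events.csv", "rb") as src, gzip.open("events.csv.gz", "wb") as dst:
    shutil.copyfileobj(src, dst)
pandas reads the compressed file directly: pd.read_csv("events.csv.gz") infers the compression from the extension.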
Data Privacy
- Exports include IP addresses and user agents
- Handle exported files according to your privacy policy
- Encrypt sensitive exports at rest and in transit (a sketch follows this list)
- Implement access controls for exported data
- Comply with GDPR, CCPA, and other data regulations
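As one option for at-rest encryption, here is a minimal sketch using the third-party cryptography package; the key handling is illustrative only, and in practice the key should come from a proper secret manager:
from cryptography.fernet import Fernet
# Illustrative key handling: load the key from a secret manager, not code.
key = Fernet.generate_key()
fernet = Fernet(key)
with open("events.csv", "rb") as f:
    encrypted = fernet.encrypt(f.read())
with open("events.csv.enc", "wb") as f:
    f.write(encrypted)
Decryption is symmetric: Fernet(key).decrypt(ciphertext) recovers the original bytes.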
Data Quality
- Check for null values in critical fields
- Validate JSON metadata is well-formed
- Verify timestamp ranges match expectations
- Monitor for duplicate events (same ID); the snippet below runs these checks
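A sketch combining the checks above; column names match the export format, and the thresholds you act on are up to you:
import json
import pandas as pd
df = pd.read_csv("events.csv")
# Nulls in critical fields.
print(df[["id", "eventName", "timestamp"]].isna().sum())
# Malformed metadata JSON.
def is_valid_json(value):
    if pd.isna(value):
        return True  # empty metadata is allowed
    try:
        json.loads(value)
        return True
    except ValueError:
        return False
print(f"bad metadata rows: {(~df['metadata'].apply(is_valid_json)).sum()}")
# Duplicate event IDs.
print(f"duplicate ids: {df['id'].duplicated().sum()}")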
Integration Examples
Load into PostgreSQL
COPY events(id, event_name, user_id, session_id, metadata, revenue_amount, revenue_currency, language, user_agent, ip_address, timestamp, created_at)
FROM '/path/to/events.csv'
WITH (FORMAT csv, HEADER true);
If the PostgreSQL server can't read files on your machine, run the same statement as \copy from psql, which streams the file from the client.
Load into BigQuery
bq load \
--source_format=CSV \
--skip_leading_rows=1 \
--autodetect \
mydataset.events \
gs://mybucket/events.csv
Analyze with Pandas
import pandas as pd
df = pd.read_csv('events.csv', parse_dates=['timestamp', 'created_at'])
# Event counts by type
print(df['eventName'].value_counts())
# Daily event volume (plotting requires matplotlib)
df.groupby(df['timestamp'].dt.date)['id'].count().plot()
# Top users by event count
print(df.groupby('userId')['id'].count().sort_values(ascending=False).head(10))
Limitations
- Rate Limits: Maximum 10 export requests per hour per project (see the retry sketch after this list)
- File Size: Exports larger than 1GB may be split into multiple files
- Retention: Events are available for export based on your plan's retention period
- Format: Only CSV format is currently supported (JSON coming soon)
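If you script against the export endpoint, budget for the rate limit. A minimal retry sketch; it assumes the API signals throttling with HTTP 429, which is an assumption rather than documented behavior:
import time
import requests
def export_with_retry(params, api_key, retries=3):
    # Assumes a 429 status on throttling; back off and retry a few times.
    for attempt in range(retries):
        response = requests.get(
            "https://api.serla.dev/api/events/export",
            params=params,
            headers={"Authorization": f"Bearer {api_key}"},
        )
        if response.status_code != 429:
            response.raise_for_status()
            return response.content
        time.sleep(60 * (attempt + 1))  # wait longer after each attempt
    raise RuntimeError("export still rate-limited after retries")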