How to Automate Google Ads Performance Reports
Build automated Google Ads reporting that updates daily without manual work.
Jay Banlasan
The AI Systems Guy
You can automate Google Ads reporting into a dashboard that updates itself every morning. No more logging into Google Ads, exporting CSVs, and pasting into spreadsheets. I set this up for every account I manage and it saves about 2 hours per week per account.
The Google Ads API is verbose, but once you wrap it properly, pulling data becomes a one-liner.
What You Need Before Starting
- A Google Ads developer token (apply through your MCC account)
- OAuth2 credentials from Google Cloud Console
- Python 3.8+ with the google-ads SDK installed
- A Google Sheet or SQLite database for storage
Step 1: Install and Configure the SDK
pip install google-ads
Create a google-ads.yaml config file:
developer_token: YOUR_DEV_TOKEN
client_id: YOUR_CLIENT_ID
client_secret: YOUR_CLIENT_SECRET
refresh_token: YOUR_REFRESH_TOKEN
login_customer_id: "1234567890"
Step 2: Build the Data Fetcher
from google.ads.googleads.client import GoogleAdsClient
from datetime import datetime, timedelta
def fetch_google_ads_data(customer_id, date=None):
    if not date:
        date = (datetime.now() - timedelta(days=1)).strftime("%Y-%m-%d")
    client = GoogleAdsClient.load_from_storage("google-ads.yaml")
    ga_service = client.get_service("GoogleAdsService")
    query = f"""
        SELECT
            campaign.name,
            ad_group.name,
            metrics.impressions,
            metrics.clicks,
            metrics.cost_micros,
            metrics.conversions,
            metrics.cost_per_conversion,
            metrics.ctr,
            metrics.average_cpc
        FROM ad_group
        WHERE segments.date = '{date}'
        ORDER BY metrics.cost_micros DESC
    """
    rows = []
    response = ga_service.search(customer_id=customer_id, query=query)
    for row in response:
        rows.append({
            "date": date,
            "campaign": row.campaign.name,
            "ad_group": row.ad_group.name,
            "impressions": row.metrics.impressions,
            "clicks": row.metrics.clicks,
            "spend": row.metrics.cost_micros / 1_000_000,  # micros -> currency units
            "conversions": row.metrics.conversions,
            "cpa": row.metrics.cost_per_conversion / 1_000_000 if row.metrics.cost_per_conversion else 0,
            "ctr": row.metrics.ctr * 100,  # API returns a fraction; store as percent
            "avg_cpc": row.metrics.average_cpc / 1_000_000,
        })
    return rows
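A note on those divisions: the API reports every money field in micros, millionths of the account currency. If you prefer, a tiny helper keeps the conversion in one place (micros_to_currency is a hypothetical name, not part of the SDK; the inline division above works just as well):

```python
def micros_to_currency(micros):
    """Convert a Google Ads micros value to whole currency units."""
    return micros / 1_000_000

# 1,250,000 micros is $1.25 in a USD account.
```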
Step 3: Store in SQLite
import sqlite3
def store_google_ads(db_path, rows):
    conn = sqlite3.connect(db_path)
    conn.execute("""CREATE TABLE IF NOT EXISTS google_ads_daily (
        id INTEGER PRIMARY KEY, date TEXT, campaign TEXT, ad_group TEXT,
        impressions INTEGER, clicks INTEGER, spend REAL, conversions REAL,
        cpa REAL, ctr REAL, avg_cpc REAL, fetched_at DATETIME DEFAULT CURRENT_TIMESTAMP
    )""")
    for row in rows:
        conn.execute("""INSERT INTO google_ads_daily
            (date, campaign, ad_group, impressions, clicks, spend, conversions, cpa, ctr, avg_cpc)
            VALUES (?,?,?,?,?,?,?,?,?,?)""",
            (row["date"], row["campaign"], row["ad_group"], row["impressions"],
             row["clicks"], row["spend"], row["conversions"], row["cpa"],
             row["ctr"], row["avg_cpc"]))
    conn.commit()
    conn.close()
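One caveat: the insert above is not idempotent, so re-running the script for the same date (after a crash, say) duplicates that day's rows. A sketch of a re-run-safe variant that clears each date before writing it (store_google_ads_idempotent is a hypothetical name for illustration):

```python
import sqlite3

def store_google_ads_idempotent(db_path, rows):
    """Like store_google_ads, but safe to re-run: replaces each date's rows."""
    conn = sqlite3.connect(db_path)
    conn.execute("""CREATE TABLE IF NOT EXISTS google_ads_daily (
        id INTEGER PRIMARY KEY, date TEXT, campaign TEXT, ad_group TEXT,
        impressions INTEGER, clicks INTEGER, spend REAL, conversions REAL,
        cpa REAL, ctr REAL, avg_cpc REAL, fetched_at DATETIME DEFAULT CURRENT_TIMESTAMP
    )""")
    # Drop any earlier run for the dates we're about to write.
    for date in {row["date"] for row in rows}:
        conn.execute("DELETE FROM google_ads_daily WHERE date = ?", (date,))
    conn.executemany("""INSERT INTO google_ads_daily
        (date, campaign, ad_group, impressions, clicks, spend, conversions, cpa, ctr, avg_cpc)
        VALUES (?,?,?,?,?,?,?,?,?,?)""",
        [(r["date"], r["campaign"], r["ad_group"], r["impressions"], r["clicks"],
          r["spend"], r["conversions"], r["cpa"], r["ctr"], r["avg_cpc"]) for r in rows])
    conn.commit()
    conn.close()
```

An alternative is a UNIQUE constraint on (date, campaign, ad_group) with INSERT OR REPLACE, but delete-then-insert keeps the schema unchanged.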
Step 4: Generate the Summary Report
def daily_summary(db_path, date=None):
    if not date:
        date = (datetime.now() - timedelta(days=1)).strftime("%Y-%m-%d")
    conn = sqlite3.connect(db_path)
    summary = conn.execute("""
        SELECT
            SUM(spend) as total_spend,
            SUM(clicks) as total_clicks,
            SUM(impressions) as total_impressions,
            SUM(conversions) as total_conversions,
            CASE WHEN SUM(conversions) > 0 THEN SUM(spend) / SUM(conversions) ELSE 0 END as avg_cpa
        FROM google_ads_daily WHERE date = ?
    """, (date,)).fetchone()
    conn.close()
    return {
        "date": date,
        "spend": round(summary[0] or 0, 2),
        "clicks": summary[1] or 0,
        "impressions": summary[2] or 0,
        "conversions": round(summary[3] or 0, 1),
        "avg_cpa": round(summary[4] or 0, 2),
    }
Step 5: Schedule and Deliver
# Run at 7:30 AM daily, after data settles
30 7 * * * cd /path/to/project && python3 google_ads_report.py >> /var/log/google_ads.log 2>&1
The script fetches, stores, and summarizes; one cron job handles the entire reporting pipeline, and delivery bolts onto the end.
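For reference, a sketch of what google_ads_report.py's entry point could look like. The step functions are passed in rather than hard-coded so the wiring itself is testable; the customer ID and database path are placeholders, and the three names match the functions defined in Steps 2-4:

```python
def run_daily_report(customer_id, db_path, fetch, store, summarize):
    """Wire the pipeline: fetch yesterday's rows, persist them, return the summary."""
    rows = fetch(customer_id)
    store(db_path, rows)
    return summarize(db_path)

if __name__ == "__main__":
    summary = run_daily_report(
        "1234567890", "ads.db",
        fetch=fetch_google_ads_data,   # Step 2
        store=store_google_ads,        # Step 3
        summarize=daily_summary,       # Step 4
    )
    print(summary)
```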
What to Build Next
Push the summary to Slack or email as a formatted morning brief. Then combine this with your Meta data to create a single cross-platform daily report.
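For the Slack leg, a hedged sketch using a Slack incoming webhook and only the standard library (the webhook URL and message layout are assumptions; build_slack_payload and post_to_slack are hypothetical names):

```python
import json
import urllib.request

def build_slack_payload(summary):
    """Shape the daily summary as a Slack incoming-webhook payload."""
    text = (f"*Google Ads {summary['date']}*  "
            f"Spend ${summary['spend']:,.2f} | Clicks {summary['clicks']:,} | "
            f"Conversions {summary['conversions']} | CPA ${summary['avg_cpa']:,.2f}")
    return {"text": text}

def post_to_slack(webhook_url, summary):
    """POST the payload to a Slack incoming webhook; returns True on HTTP 200."""
    data = json.dumps(build_slack_payload(summary)).encode("utf-8")
    req = urllib.request.Request(webhook_url, data=data,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return resp.status == 200
```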
Related Reading
- AI-Powered Reporting That Actually Gets Read - making reports people actually care about
- The Pipeline Architecture - how to think about data pipelines
- Data Flow Architecture for Non-Engineers - simple data flow concepts
Want this system built for your business?
Get a free assessment. We will map every system your business needs and show you the ROI.
Get Your Free Assessment