This article and the posts that follow explain how to automate campaign reporting from multiple channels. As digital marketers, we run campaigns on platforms like Google, Meta, LinkedIn, and Snapchat. Exporting campaign performance metrics like spend, clicks, and conversions at the campaign or ad level on a daily basis is a tedious task for many digital marketers. We'll tackle this problem using APIs, Python, Snowflake, Airflow, and a BI (Business Intelligence) tool.
As a marketer, you're running online campaigns on Meta, Google, and other media publishers. Your goal is to automate reporting that shows numbers like spend, clicks, and conversions at the campaign or ad level, broken down by day, week, or month. Right now, you're downloading daily or weekly reports from Google Ads, Facebook Ads, etc., copying and pasting them into a huge Google Sheet, and using VLOOKUP or SUMIF functions to build an aggregated table. Does it work? Yes. Is it a lot of work? Yes. Let's automate this tedious process.
To start off, you retrieve these reports using each publisher's API instead of manually exporting and downloading them. Then you transform the data by aggregating rows by campaign and date, and store the transformed data in a data warehouse. Finally, you connect the warehouse to a BI tool or a spreadsheet, so a clean table with yesterday's data is waiting for you when you wake up. Let's see how we can do that.
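To give you a taste of the extraction step, here's a minimal sketch of pulling one day of campaign metrics over HTTP. The endpoint, token, parameters, and response shape are all hypothetical; in practice, each platform (Google Ads API, Meta Marketing API, and so on) has its own SDK, authentication flow, and report format.

```python
import requests
from datetime import date, timedelta

# Hypothetical endpoint and token for illustration only; each platform
# has its own SDK and auth flow (OAuth, developer tokens, etc.).
REPORT_URL = "https://api.example-ads-platform.com/v1/reports"
ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"

def fetch_daily_report(report_date: date) -> list:
    """Pull one day of campaign-level metrics from the (hypothetical) API."""
    response = requests.get(
        REPORT_URL,
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        params={
            "date": report_date.isoformat(),
            "level": "campaign",
            "metrics": "impressions,clicks,spend,conversions",
        },
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["rows"]

# Yesterday's data, ready to transform and load.
rows = fetch_daily_report(date.today() - timedelta(days=1))
```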
We use Python to write a script that combines rows from different sources, like Google and Meta. The script runs every day, combining data per campaign, ad, and advertiser. It aggregates KPIs like impressions, clicks, spend, and conversions, attributing them to each campaign, so you get a clear view of the performance metrics for each ad and campaign.
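Here's a small sketch of that combine-and-aggregate step using pandas. The sample rows are made up for illustration and stand in for what the extraction step above would return from each platform:

```python
import pandas as pd

# Example rows as they might come back from each platform's API after
# extraction (values are made up for illustration).
google_rows = [
    {"date": "2024-05-01", "campaign_id": "g-101", "impressions": 1200,
     "clicks": 85, "spend": 42.50, "conversions": 6},
]
meta_rows = [
    {"date": "2024-05-01", "campaign_id": "m-201", "impressions": 900,
     "clicks": 40, "spend": 30.00, "conversions": 3},
]

google_df = pd.DataFrame(google_rows).assign(platform="google")
meta_df = pd.DataFrame(meta_rows).assign(platform="meta")

# Stack both sources, then sum the KPIs per platform/campaign/day.
combined = pd.concat([google_df, meta_df], ignore_index=True)
daily_kpis = (
    combined
    .groupby(["platform", "campaign_id", "date"], as_index=False)
    [["impressions", "clicks", "spend", "conversions"]]
    .sum()
)
print(daily_kpis)
```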
Let's clarify some key concepts in this process:
What's tricky here is that advertising platforms like TikTok, Google, and Meta use different column names for conversions, especially when custom conversions are set up. To manage this, you map and standardize these columns, transforming the data into a unified format. All of this data is aggregated into a single, large table stored in your data warehouse, such as Snowflake, BigQuery, or Databricks. This transformation and mapping ensures that the conversions recorded by each platform align properly in your reports.
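One simple way to do this mapping is a per-platform rename dictionary. The source column names below are illustrative placeholders, not the platforms' actual API field names:

```python
import pandas as pd

# Each platform names its metrics differently; map every source column
# to one unified schema before loading. (Illustrative names only.)
COLUMN_MAPS = {
    "google": {"cost": "spend", "all_conversions": "conversions"},
    "meta": {"amount_spent": "spend", "results": "conversions"},
    "tiktok": {"total_cost": "spend", "custom_conversion_1": "conversions"},
}

def standardize(df: pd.DataFrame, platform: str) -> pd.DataFrame:
    """Rename platform-specific columns to the unified warehouse schema."""
    return df.rename(columns=COLUMN_MAPS[platform])

# A TikTok report with its own column names, normalized for the big table.
tiktok_df = pd.DataFrame(
    [{"date": "2024-05-01", "campaign_id": "t-301",
      "total_cost": 25.0, "custom_conversion_1": 2}]
)
unified = standardize(tiktok_df, "tiktok")
```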
So how do you make these Python scripts run every day? We use Airflow. Airflow is a powerful workflow orchestration tool that can run data pipelines like this one and keep your Snowflake table up to date. In this context, Airflow can do the following (a minimal DAG sketch follows the list):

- Schedule the extract, transform, and load steps to run automatically every day
- Enforce the order of the steps, so loading only starts once the transformation has succeeded
- Retry failed tasks automatically and alert you when a run breaks
- Give you a web UI to monitor runs and backfill missed days
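Here's a minimal Airflow DAG that runs the pipeline once a day. The three callables are placeholders standing in for the extract, transform, and load logic sketched above:

```python
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

# Placeholder callables; in practice these would wrap the extract,
# transform, and load logic from the earlier snippets.
def extract_reports():
    ...

def transform_rows():
    ...

def load_to_snowflake():
    ...

with DAG(
    dag_id="ad_reporting_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",  # run once per day
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_reports)
    transform = PythonOperator(task_id="transform", python_callable=transform_rows)
    load = PythonOperator(task_id="load", python_callable=load_to_snowflake)

    # Run the steps in order: extract, then transform, then load.
    extract >> transform >> load
```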
In this setup, Airflow acts as the central manager, ensuring your Python scripts run on time and your Snowflake table stays consistently up to date.
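For completeness, here's a sketch of that load step using the Snowflake Python connector's write_pandas helper. The credentials and table name are placeholders, and daily_kpis refers to the DataFrame built in the aggregation step above; the target table is assumed to already exist:

```python
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

# Placeholder credentials; in production, pull these from a secrets
# backend (e.g. an Airflow connection) rather than hardcoding them.
conn = snowflake.connector.connect(
    user="YOUR_USER",
    password="YOUR_PASSWORD",
    account="YOUR_ACCOUNT",
    warehouse="REPORTING_WH",
    database="MARKETING",
    schema="REPORTS",
)

# Append the aggregated KPIs (the daily_kpis DataFrame from the
# aggregation step) to an existing reporting table.
write_pandas(conn, daily_kpis, table_name="DAILY_CAMPAIGN_KPIS")
conn.close()
```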
This is an overview of the first step of reporting automation. In the next article, we'll cover how to connect a data warehouse to your BI tool so you can create tables and visualizations using the data from the warehouse.