Carbon Intelligence™ · Connect Your Data
Carbon Intelligence™ connects to all of the major DSPs and ad platforms. This guide walks you through each connection; most take only a few minutes.

Display & Video 360
Google's programmatic DSP — the richest data source for carbon analysis with exclusive access to connection type, exchange, and environment dimensions.
Available Dimensions
In DV360, go to Reporting → Create Report.
Select the dimensions listed above — include Exchange, Connection Type, and Environment for maximum granularity.
Add Impressions, Clicks, Total Media Cost, and any other KPIs you track.
Run the report and download it as a CSV file.
Drag and drop the CSV into the Carbon Intelligence™ platform — your emissions data will be ready in seconds.
Ensure Python 3.8+ is installed. Run pip install google-api-python-client google-auth.
In Google Cloud Console, create a Service Account with DV360 API access and the Display & Video 360 scope.
Download the Service Account key file and save it securely. You'll reference this path in the script.
Open the script and set your ADVERTISER_ID, KEY_FILE path, and desired date range.
Execute python carbon-intelligence-dv360-export.py. The script will pull data via the DV360 API and generate a CSV ready for upload.
Use cron (Linux/Mac) or Task Scheduler (Windows) to automate weekly data pulls.
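The export script ships with Carbon Intelligence™, but if you want to see roughly what it does (or adapt it), here is a minimal sketch of a DV360 report pull via the Bid Manager API. The report-body field names and dimension enums below are illustrative approximations; verify them against the current Bid Manager API reference before relying on them.

```python
# Sketch in the spirit of carbon-intelligence-dv360-export.py.
# Assumes step 1 above: pip install google-api-python-client google-auth
# Field and enum names approximate the Bid Manager API v2 schema.

ADVERTISER_ID = "1234567"            # your DV360 advertiser ID
KEY_FILE = "service-account.json"    # path to the Service Account key (step 3)

def build_query_body(advertiser_id: str) -> dict:
    """Report definition: the dimensions and metrics used for carbon analysis."""
    return {
        "metadata": {"title": "carbon-intelligence-export", "format": "CSV"},
        "params": {
            "type": "STANDARD",
            "groupBys": [
                "FILTER_ADVERTISER",
                "FILTER_EXCHANGE_ID",   # Exchange
                "FILTER_DEVICE_TYPE",   # device/connection granularity
                "FILTER_COUNTRY",       # needed for energy-mix precision
            ],
            "metrics": [
                "METRIC_IMPRESSIONS",
                "METRIC_CLICKS",
                "METRIC_TOTAL_MEDIA_COST_ADVERTISER",
            ],
            "filters": [{"type": "FILTER_ADVERTISER", "value": advertiser_id}],
        },
        "schedule": {"frequency": "ONE_TIME"},
    }

def main():
    # Imported lazily so build_query_body stays usable without the SDK installed.
    from google.oauth2 import service_account
    from googleapiclient.discovery import build

    creds = service_account.Credentials.from_service_account_file(
        KEY_FILE, scopes=["https://www.googleapis.com/auth/doubleclickbidmanager"])
    dbm = build("doubleclickbidmanager", "v2", credentials=creds)
    query = dbm.queries().create(body=build_query_body(ADVERTISER_ID)).execute()
    dbm.queries().run(queryId=query["queryId"]).execute()
    # Poll the query's reports until the run finishes, then download the CSV URL.
    # Call main() to run the export.
```

A weekly cron entry for step 6 could look like `0 6 * * 1 python carbon-intelligence-dv360-export.py`.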
In Google Cloud Console, enable the BigQuery Data Transfer Service for your project.
Set up a scheduled transfer from DV360 to BigQuery, selecting the advertiser IDs and dimensions you need.
Create or select a BigQuery dataset to receive the DV360 data. Set the appropriate region and expiration policies.
In Carbon Intelligence™, add a BigQuery data source and authenticate with your Google Cloud credentials.
Select the tables to sync and set the refresh frequency. Data will flow automatically into your carbon dashboard.

Google Ads
Search, Display, Video, Performance Max — the most widely used advertising platform. Native Google Ads Scripts make automation seamless.
Available Dimensions
Navigate to Reports → Predefined reports in your Google Ads account.
Add Campaign, Device, Geography, Placements, Creative Size, and Ad Format dimensions.
Include Impressions, Clicks, Cost, and any other metrics you track.
Run the report and download it as a CSV file.
Drag and drop the CSV into the Carbon Intelligence™ platform.
Google Ads Scripts run natively inside Google Ads — no local setup required. The script exports data to a Google Sheet, which you can then download or connect directly.
In Google Ads, navigate to Tools & Settings → Scripts.
Click the + button to create a new script and paste in the Carbon Intelligence™ Google Ads export script.
Create a new Google Sheet and paste its URL into the SPREADSHEET_URL variable in the script.
Click Authorize, then Run. The script will populate 9 tabs in your Google Sheet.
Set the script to run weekly under Scripts → Frequency.
Google Sheet tabs created by the script
| Tab Name | Contents |
|---|---|
| CI_Campaigns | Campaign-level data with IDs, types, and bidding strategies |
| CI_Device | Performance breakdown by device type |
| CI_Geo | Geographic performance by country and region |
| CI_Placements | Site and app placement details |
| CI_CreativeSize | Creative dimensions and ad types |
| CI_AdFormat | Search, Display, Video, Discovery breakdown |
| CI_Video | Video quartiles, duration, and view rates |
| CI_Metadata | Account info, date ranges, and export metadata |
| CI_Export_CSV | Combined flat export ready for CSV download |
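If you'd rather automate the download step instead of exporting the sheet by hand, a small helper like the following (not part of the product, just an illustration) can pull the CI_Export_CSV tab directly. It uses Google Sheets' public CSV rendering endpoint, which only works if the sheet is shared as link-viewable; for private sheets, use the Google Sheets API instead.

```python
# Illustrative helper: fetch one tab of a link-viewable Google Sheet as CSV.
import csv
import io
import urllib.parse
import urllib.request

def sheet_csv_url(spreadsheet_id: str, tab_name: str = "CI_Export_CSV") -> str:
    """Google Sheets' gviz endpoint renders a single named tab as CSV."""
    return (f"https://docs.google.com/spreadsheets/d/{spreadsheet_id}"
            f"/gviz/tq?tqx=out:csv&sheet={urllib.parse.quote(tab_name)}")

def download_tab(spreadsheet_id: str, tab_name: str = "CI_Export_CSV") -> list:
    """Fetch the tab and parse it into a list of rows."""
    with urllib.request.urlopen(sheet_csv_url(spreadsheet_id, tab_name)) as resp:
        return list(csv.reader(io.TextIOWrapper(resp, encoding="utf-8")))
```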
In Google Cloud Console, enable the BigQuery Data Transfer Service.
Create a Google Ads transfer configuration, linking your Ads account (MCC or individual) to BigQuery.
Select which Google Ads tables to transfer and set the refresh schedule (daily recommended).
In Carbon Intelligence™, add BigQuery as a data source and authenticate.
Map the BigQuery tables to Carbon Intelligence™ dimensions and enable automatic syncing.

Meta Ads
Facebook, Instagram, Audience Network, Messenger — the largest social advertising ecosystem. Unique granularity on placements (Feed, Stories, Reels) and actual impression devices.
Available Dimensions
Go to Meta Ads Manager and select the campaigns you want to analyze.
Click Columns → Customize Columns and add the dimensions listed above.
Use Breakdown → By Delivery to add Device, Platform, Placement, and Impression Device breakdowns.
Click Export → Export Table Data and choose CSV format.
Upload the CSV to Carbon Intelligence™ for instant emissions analysis.
Use Meta's Marketing API v19.0 to programmatically pull campaign data with all available breakdowns.
Go to developers.facebook.com and create a new app with Marketing API access.
Generate a System User Access Token with ads_read permission on the ad accounts you need.
Set your ACCESS_TOKEN, AD_ACCOUNT_ID, and desired date range in the script.
Execute python carbon-intelligence-meta-ads-export.py. The script handles pagination and rate limits automatically.
Schedule the script to run weekly and upload the output CSV to Carbon Intelligence™.
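For reference, the core of such a script is a paginated call to the Marketing API's insights endpoint. The sketch below (a simplified stand-in for carbon-intelligence-meta-ads-export.py, not the shipped script) shows the URL shape and cursor-based pagination; the exact field and breakdown lists you need may differ.

```python
# Simplified sketch of a Marketing API insights pull with cursor pagination.
import json
import urllib.parse
import urllib.request

ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"   # System User token with ads_read
AD_ACCOUNT_ID = "act_1234567"        # placeholder account ID
API_VERSION = "v19.0"

def insights_url(account_id, since, until, after=None):
    """Build an insights request URL with delivery breakdowns."""
    params = {
        "access_token": ACCESS_TOKEN,
        "level": "ad",
        "fields": "campaign_name,impressions,clicks,spend",
        "breakdowns": "publisher_platform,platform_position,impression_device",
        "time_range": json.dumps({"since": since, "until": until}),
        "limit": "500",
    }
    if after:
        params["after"] = after  # cursor from paging.cursors.after
    return (f"https://graph.facebook.com/{API_VERSION}/"
            f"{account_id}/insights?" + urllib.parse.urlencode(params))

def fetch_all(account_id, since, until,
              fetch=lambda url: json.load(urllib.request.urlopen(url))):
    """Follow the 'after' cursor until the API reports no further pages."""
    rows, after = [], None
    while True:
        page = fetch(insights_url(account_id, since, until, after))
        rows.extend(page.get("data", []))
        after = page.get("paging", {}).get("cursors", {}).get("after")
        if not after or "next" not in page.get("paging", {}):
            return rows
```

The injectable `fetch` argument also makes the pagination logic easy to test without network access.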
Connect Meta Ads to your data warehouse via an ETL connector for automated, real-time data flow.
Select a connector that supports Meta Ads: Fivetran, Supermetrics, Funnel.io, or Adverity.
Connect your Meta Ads account to the ETL tool using OAuth. Select the ad accounts to sync.
Set up your destination warehouse (BigQuery, Snowflake, Redshift, or Databricks).
Choose the Meta Ads tables to sync (campaign insights, breakdowns, creative reports) and set the sync frequency.
Link your data warehouse to Carbon Intelligence™ and map the Meta Ads tables to carbon dimensions.

The Trade Desk
The leading independent DSP — full programmatic transparency with buy type, supply vendor, and environment dimensions for granular carbon analysis.
Available Dimensions
In The Trade Desk, navigate to Analytics → My Reports.
Click New Report and select the advertiser and date range.
Add the dimensions listed above including Supply Vendor, Buy Type, and Environment. Add Impressions and Cost metrics.
Run the report and download as CSV.
Upload the CSV to Carbon Intelligence™.
Use The Trade Desk's REDS API v3 to programmatically export report data with full dimension granularity.
Obtain your API token from The Trade Desk platform under Settings → API Tokens.
Set your API_TOKEN, PARTNER_ID, and ADVERTISER_ID in the script.
Execute python carbon-intelligence-thetradedesk-export.py. The script queries the REDS API and outputs a CSV.
Automate with a scheduler and upload the output to Carbon Intelligence™.
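As a rough idea of what the export script does: it authenticates each request with The Trade Desk's TTD-Auth header, queries report executions, and downloads the newest completed CSV. The endpoint URL and response schema below are assumptions for illustration; confirm them against The Trade Desk API documentation.

```python
# Hypothetical sketch of a Trade Desk report download. TTD-Auth is the
# platform's standard API auth header; the endpoint and JSON field names
# below are assumed and must be checked against the official API docs.
import json
import urllib.request

API_TOKEN = "YOUR_API_TOKEN"
REPORTS_URL = ("https://api.thetradedesk.com/v3/myreports/"
               "reportexecution/query/advertisers")  # assumed endpoint

def latest_download_url(executions):
    """Pick the newest completed execution's download link (assumed schema)."""
    done = [e for e in executions.get("Result", [])
            if e.get("ReportExecutionState") == "Complete"]
    if not done:
        return None
    newest = max(done, key=lambda e: e.get("ReportEndDateExclusive", ""))
    deliveries = newest.get("ReportDeliveries", [])
    return deliveries[0].get("DownloadURL") if deliveries else None

def fetch_executions(advertiser_id):
    """POST a paged query for this advertiser's report executions."""
    req = urllib.request.Request(
        REPORTS_URL,
        data=json.dumps({"AdvertiserIds": [advertiser_id],
                         "PageStartIndex": 0, "PageSize": 10}).encode(),
        headers={"TTD-Auth": API_TOKEN, "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```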
Connect The Trade Desk's log-level data or use an ETL connector for warehouse-based integration.
Contact your TTD account manager to enable log-level data (LLD) delivery to your cloud storage (S3, GCS, or Azure Blob).
Set up your cloud bucket and configure your data warehouse to ingest the LLD files on a schedule.
Build SQL views or dbt models to aggregate log-level data into the dimensions needed by Carbon Intelligence™.
Link your warehouse to Carbon Intelligence™ and map the aggregated tables to carbon dimensions.

Amazon DSP
Access to Amazon's exclusive inventory — IMDb TV, Twitch, Fire TV, Prime Video Ads. Unique supply source dimension separating Amazon-owned vs. third-party inventory.
Available Dimensions
In Amazon DSP, navigate to Measurement & Reporting → Reports.
Click Create Report and select your advertiser, date range, and report type.
Select all available dimensions including Supply Source and Ad Format. Add Impressions and Total Cost metrics.
Run the report and download as CSV.
Upload the CSV to Carbon Intelligence™.
Use Amazon Ads API to programmatically pull reporting data from Amazon DSP.
Go to the Amazon Ads developer portal and register your application. Request access to the DSP reporting scope.
Configure OAuth 2.0 credentials — you'll need a CLIENT_ID, CLIENT_SECRET, and REFRESH_TOKEN.
Set your credentials, PROFILE_ID, and date range in the script.
Execute python carbon-intelligence-amazon-dsp-export.py. The script requests, polls, and downloads the report automatically.
Automate weekly runs and upload the output CSV to Carbon Intelligence™.
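Under the hood, the flow is: refresh the OAuth token, request a report, poll until it is ready, then download. The token endpoint and header names follow Amazon's documented Login with Amazon conventions; the report endpoints themselves vary by report type, so treat this as a sketch and check the Amazon Ads API reference for the exact DSP report calls.

```python
# Sketch of the Amazon DSP export flow: token refresh plus a generic poller.
import json
import time
import urllib.parse
import urllib.request

CLIENT_ID = "YOUR_CLIENT_ID"
CLIENT_SECRET = "YOUR_CLIENT_SECRET"
REFRESH_TOKEN = "YOUR_REFRESH_TOKEN"

def token_request():
    """Exchange the refresh token for a short-lived access token."""
    data = urllib.parse.urlencode({
        "grant_type": "refresh_token",
        "refresh_token": REFRESH_TOKEN,
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
    }).encode()
    return urllib.request.Request("https://api.amazon.com/auth/o2/token",
                                  data=data)

def poll(check, attempts=10, delay=30, sleep=time.sleep):
    """Call check() until it returns a truthy value (e.g. a download URL)."""
    for _ in range(attempts):
        url = check()
        if url:
            return url
        sleep(delay)
    raise TimeoutError("report not ready")
```

Subsequent report requests then carry the access token plus the `Amazon-Advertising-API-ClientId` and `Amazon-Advertising-API-Scope` (profile ID) headers.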
Use Amazon Marketing Cloud (AMC) for the deepest level of Amazon DSP data integration.
Contact your Amazon Ads account team to provision an Amazon Marketing Cloud instance for your advertiser.
In the AMC UI, write SQL queries to aggregate DSP impression-level data by the dimensions you need.
Configure AMC to output query results to an S3 bucket you control.
Load the S3 data into your warehouse (Redshift, Snowflake, BigQuery, or Databricks).
Link your warehouse to Carbon Intelligence™ and map the AMC output tables.
Other Platforms
Xandr, Yahoo DSP, TikTok Ads, Spotify Ads, Pinterest Ads, Snapchat Ads, LinkedIn Ads, and any other advertising platform — Carbon Intelligence™ analyzes any campaign data via CSV import.
Export your campaign data from any DSP or advertising platform as a CSV file. Carbon Intelligence™ correlates your carbon footprint with advertising performance (clicks, conversions, video views, revenue), so you can see how carbon reduction and results move together. The more dimensions you include, the more precise your carbon and performance analysis will be.
📊 Impact on calculation precision
📋 How to prepare your CSV
Navigate to your platform's reporting section and create a custom report. Include as many dimensions from the table above as possible.
Your CSV must include at least: Date, Campaign, Impressions, Clicks, Cost, and Country. For full performance analysis, also add Conversions and Video Views. Without country, the energy mix defaults to a global average and loses precision.
Download the report as a CSV file. Ensure the first row contains column headers.
Drag and drop your CSV into the platform. The engine automatically detects columns, normalizes headers, and launches the carbon analysis.
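You can sanity-check an export before uploading it. The snippet below is an illustrative version of that check (the required column names come from the requirements above; the normalization rules are assumptions about how a typical import engine treats header variants, not the platform's exact logic):

```python
# Illustrative pre-upload check: confirm the CSV carries the required columns
# (Date, Campaign, Impressions, Clicks, Cost, Country) after normalizing
# common header variants such as casing, whitespace, and underscores.
import csv

REQUIRED = {"date", "campaign", "impressions", "clicks", "cost", "country"}
RECOMMENDED = {"conversions", "video views"}

def normalize(header):
    """Lowercase, trim, and collapse separators: 'Video_Views ' -> 'video views'."""
    return " ".join(header.strip().lower().replace("_", " ").split())

def check_headers(headers):
    """Return (missing_required, missing_recommended) after normalization."""
    seen = {normalize(h) for h in headers}
    return REQUIRED - seen, RECOMMENDED - seen

def check_csv(path):
    with open(path, newline="", encoding="utf-8") as f:
        headers = next(csv.reader(f))  # first row must be the column headers
    return check_headers(headers)
```

If `missing_required` is non-empty (in particular `country`, which drives the energy-mix lookup), fix the export before uploading.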
Choose your integration level
Three ways to connect
CSV Upload
~2 min
- No technical setup required
- Manual export from DSP UI
- Upload via drag & drop
- Best for one-off analyses
- Ideal for getting started quickly
API Script
~10 min
- Automated data extraction
- Pre-built scripts for each DSP
- Schedule weekly or daily runs
- Consistent formatting guaranteed
- Best for ongoing monitoring
Data Warehouse
Real-time
- Direct warehouse connection
- Real-time or near-real-time sync
- Maximum data granularity
- Enterprise-grade reliability
- Best for large-scale operations