Understanding the full landscape of Campaign Manager 360 dimensions and metrics is essential for anyone managing digital advertising measurement across multiple channels. Whether you're building custom reports in Report Builder, pulling data through the CM360 API, or analyzing cross-channel performance in connected BigQuery datasets, knowing exactly what data is available — and what each field means — is the foundation of effective campaign measurement.
This guide provides a complete reference of every dimension and metric available in CM360 as of 2026. We've organized them by category — campaigns, sites, placements, ads, creatives, and conversion activities — and included practical context on when and how to use each one.
What Are CM360 Dimensions vs Metrics?
Before diving into the full reference, it's important to understand the difference between dimensions and metrics — two concepts that serve fundamentally different purposes in ad serving data.
Dimensions are descriptive attributes that define what you're looking at. They are the labels, categories, and identifiers that let you organize and filter your data. Examples include campaign name, site, placement, ad type, and creative size. Dimensions answer the question: "How do I want to slice this data?"
Metrics are quantitative measurements that tell you how things performed. They are the numbers: impressions, clicks, conversions, interaction time, viewability rate. Metrics answer the question: "What happened with my ads?"
Breakdowns are cross-cutting dimensions you can apply to any metric to segment performance. For example, you can break down impressions by site, device, browser, or geographic location to understand delivery patterns.
How Is CM360 Data Structured?
CM360 data follows a hierarchy: Advertiser > Campaign > Site > Placement > Ad > Creative. Campaigns organize your media buys. Sites represent publisher properties. Placements define the specific ad slots on each site. Ads connect creatives to placements through ad assignments. Creatives are the actual ad assets. Metrics can be queried at any level and roll up from creative to placement to site to campaign.
Unlike DV360, which is a buying platform, CM360 is an ad server and measurement platform. It tracks delivery and conversions across all your media buys — programmatic (DV360), direct (publisher deals), and other channels. Floodlight conversion tags in CM360 provide the unified conversion data that feeds back to DV360, Google Ads, and Search Ads 360.
Campaign Dimensions
Campaign-level dimensions define the top-level structure of your advertising measurement. These fields identify the campaign and its core configuration — the advertiser, schedule, and billing information. Use these dimensions to organize reporting by media initiative.
| Dimension | Description |
|---|---|
| Campaign ID | Unique numeric identifier for the campaign in CM360 |
| Campaign Name | The name of the campaign as set by the trafficker or advertiser |
| Advertiser | The advertiser account the campaign belongs to — the top-level organizational entity |
| Campaign Start Date | The date the campaign is scheduled to begin delivering |
| Campaign End Date | The date the campaign is scheduled to stop delivering |
| Billing Invoice Code | Custom billing identifier used for financial reconciliation and invoicing |
Site & Placement Dimensions
Sites and placements are the core of CM360's ad serving model. A site represents a publisher property (such as nytimes.com or espn.com), while placements define the specific ad slots on that site — their size, type, and compatibility settings. Understanding these dimensions is critical for publisher-level performance analysis.
| Dimension | Description |
|---|---|
| Site ID | Unique numeric identifier for the site (publisher property) |
| Site Name | Name of the publisher site as configured in CM360 |
| Placement ID | Unique numeric identifier for the placement (specific ad slot) |
| Placement Name | Name of the placement as defined during trafficking |
| Placement Type | Category of the placement: Standard, In-Stream Video, VPAID, Interstitial, or Tracking |
| Placement Size | Pixel dimensions of the placement (e.g., 300x250, 728x90, 160x600, 970x250) |
| Placement Compatibility | Ad format compatibility: Display, Display Interstitial, In-Stream Video, In-Stream Audio |
| Directory Site | The directory site associated with the placement for programmatic inventory matching |
| Package / Roadblock | Grouping of placements into a package (multiple sizes on same page) or roadblock (all placements shown simultaneously) |
Ad & Creative Dimensions
Ad dimensions describe the ad objects that connect creatives to placements, while creative dimensions detail the actual ad assets. CM360 supports a wide range of creative types — from standard display banners to rich media, video, and HTML5 experiences. Understanding these dimensions is essential for creative rotation analysis and format optimization.
| Dimension | Description |
|---|---|
| Ad ID | Unique numeric identifier for the ad object in CM360 |
| Ad Name | Name of the ad as defined during trafficking |
| Ad Type | Category: Standard, Click Tracker, Tracking, Default, or Internal Redirect |
| Creative ID | Unique numeric identifier for the creative asset |
| Creative Name | Name of the creative as defined by the advertiser or agency |
| Creative Type | Format: Display Image, Display Redirect, Rich Media, HTML5 Banner, In-Stream Video, In-Stream Audio, VPAID |
| Creative Size | Pixel dimensions of the creative (e.g., 300x250, 728x90) |
| Creative Pixel Size | Actual rendered pixel dimensions — may differ from placement size for responsive creatives |
| Rendering ID | Identifier for the specific creative rendering — used to track multiple versions of a creative |
| Companion Creative | The companion banner creative paired with an in-stream video ad |
| Rich Media Type | Sub-type for rich media creatives: Expanding, Floating, In-Page, In-Page with Floating, Multi-Floating, Peel-Down |
Core Delivery Metrics
These are the fundamental metrics that measure how your ads are served and interacted with through CM360's ad server. Every CM360 user should understand these — they form the basis of all delivery reporting and publisher performance evaluation.
| Metric | Description | Formula / Notes |
|---|---|---|
| Impressions | Number of times your ads were served by CM360 | Counts each ad server call that returned an ad — the primary delivery metric |
| Clicks | Number of clicks tracked by CM360's click tracker | Records when a user clicks on the ad and is redirected through CM360's click URL |
| CTR (Click-Through Rate) | Percentage of impressions that resulted in a click | (Clicks ÷ Impressions) × 100 |
| Active View Viewable Impressions | Impressions that met the MRC viewability standard | Display: 50% pixels in view for 1 second. Video: 50% pixels in view for 2 seconds |
| Active View Measurable Impressions | Impressions where viewability could be measured | Some environments cannot be measured (certain iframes, non-JavaScript environments) |
| Active View Eligible Impressions | Impressions served on inventory that supports Active View | Superset of measurable — includes impressions where measurement was attempted but failed |
| Active View % Viewable | Percentage of measurable impressions that were viewable | (Viewable Impressions ÷ Measurable Impressions) × 100 |
| Click Rate | Alternative click rate calculation | Used in certain CM360 report types — functionally equivalent to CTR |
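The two rate formulas in the table above can be sketched as small helpers — a minimal illustration of the arithmetic, not an official CM360 calculation; the function names are our own:

```python
def ctr(clicks: int, impressions: int) -> float:
    """Click-through rate as a percentage: (clicks / impressions) * 100."""
    if impressions == 0:
        return 0.0
    return clicks / impressions * 100


def active_view_pct_viewable(viewable: int, measurable: int) -> float:
    """Active View % Viewable: viewable / measurable impressions, as a percentage.
    Note the denominator is MEASURABLE impressions, not total impressions."""
    if measurable == 0:
        return 0.0
    return viewable / measurable * 100


# Example: 1,000,000 impressions and 1,200 clicks -> 0.12% CTR
print(round(ctr(1_200, 1_000_000), 2))                       # 0.12
print(round(active_view_pct_viewable(650_000, 900_000), 1))  # 72.2
```

The guard clauses matter in practice: low-volume placements with zero impressions or zero measurable impressions will otherwise crash a reporting script with a division-by-zero error.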
Floodlight Conversion Metrics
Floodlight is CM360's conversion tracking system. Floodlight tags placed on your website track user actions and attribute them back to ad impressions and clicks. Conversions are the most important metrics for measuring campaign ROI — they tell you whether your advertising is driving actual business results.
| Metric | Description | Formula / Notes |
|---|---|---|
| Total Conversions | Sum of all click-through, view-through, and cross-environment conversions | Includes all Floodlight activities attributed to the campaign |
| Click-Through Conversions | Conversions where the user clicked the ad before converting | Highest attribution confidence — direct user intent demonstrated |
| View-Through Conversions | Conversions where the user saw the ad but did not click before converting | Lower attribution confidence — user was exposed but did not interact |
| Cross-Environment Conversions | Conversions where the impression and conversion occurred on different devices or browsers | Uses Google's cross-device graph — important for mobile-to-desktop paths |
| Conversion Revenue | Total revenue value from Sales-type Floodlight activities | Only available for Sales tags — Counter tags do not track revenue |
| CPA (Cost Per Acquisition) | Average cost per conversion | Requires cost data to be populated in CM360 (manually or via DV360 integration) |
| Floodlight Configuration | The parent Floodlight configuration containing the conversion activities (a dimension, not a metric) | Typically one per advertiser — shared across campaigns |
| Floodlight Activity | The specific Floodlight tag that tracked the conversion (a dimension, not a metric) | Break down by activity to see conversions per event type (purchase, sign-up, etc.) |
Floodlight attribution windows: Default windows are 30 days for click-through and 30 days for view-through, but they are configurable per activity. Shorter windows produce fewer conversions with higher attribution confidence. When comparing conversion data across platforms, always verify which attribution windows are active — a 30-day view-through window in CM360 versus a 1-day view window in Meta Ads will produce dramatically different conversion counts.
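The effect of a lookback window can be illustrated with a simple check of whether a conversion falls inside the window that opens at the ad touch. This is a conceptual sketch of the windowing logic, not CM360's actual attribution engine:

```python
from datetime import datetime, timedelta


def within_window(touch_time: datetime, conversion_time: datetime,
                  lookback_days: int) -> bool:
    """True if the conversion occurred within `lookback_days` after the
    ad touch (click or impression). Conversions before the touch never count."""
    delta = conversion_time - touch_time
    return timedelta(0) <= delta <= timedelta(days=lookback_days)


click = datetime(2026, 3, 1, 12, 0)
conv = click + timedelta(days=10)  # converted 10 days after the click

print(within_window(click, conv, 30))  # True  — inside a 30-day window
print(within_window(click, conv, 7))   # False — outside a 7-day window
```

The same conversion counts under CM360's 30-day default but would be dropped under a 7-day window — exactly why cross-platform conversion counts diverge when windows differ.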
Rich Media Metrics
Rich media metrics measure engagement with interactive ad formats — expandable banners, floating ads, in-page interactive units, and HTML5 experiences. These metrics go beyond basic impression and click data to capture the quality and depth of user engagement with your creative content.
| Metric | Description | Formula / Notes |
|---|---|---|
| Rich Media Impressions | Impressions specifically from rich media creative formats | Subset of total impressions — only counts rich media creative types |
| Rich Media Interaction Time | Total time (in seconds) users spent interacting with rich media creatives | Measures active engagement — mouse-over, scrolling, clicking within the unit |
| Rich Media Interactions | Total number of user-initiated interactions with rich media ads | Any intentional engagement: hover, click, swipe, expand, play, etc. |
| Rich Media Expansions | Number of times users expanded an expandable rich media creative | Counts both auto-expand (on hover) and user-initiated expansions depending on configuration |
| Rich Media Video Plays | Number of times a video within a rich media unit started playing | Tracks video playback within interactive ad units (not in-stream video) |
| Rich Media Video Completions | Number of times a video within a rich media unit played to 100% | Full video view within the rich media creative |
| Rich Media Video Pauses | Number of times users paused a video within a rich media unit | Can indicate interest (reviewing content) or friction |
| Custom Variable Counts | Custom interaction counters defined within the rich media creative | Up to 100 custom variables per creative — track specific in-ad interactions like tab clicks, gallery swipes, form starts |
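The rich media engagement rates referenced later in the benchmarks section derive from these counts. A hedged sketch of two common derivations — conventions vary, so confirm the exact denominator your team uses before comparing against benchmarks:

```python
def interaction_rate(interactions: int, rm_impressions: int) -> float:
    """Interaction rate as a percentage of rich media impressions."""
    return interactions / rm_impressions * 100 if rm_impressions else 0.0


def avg_interaction_time(total_seconds: float, interactions: int) -> float:
    """Average seconds of engagement per interaction — one common convention;
    some teams divide by impressions instead."""
    return total_seconds / interactions if interactions else 0.0


print(round(interaction_rate(4_500, 200_000), 2))     # 2.25
print(round(avg_interaction_time(54_000, 4_500), 1))  # 12.0
```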
Video Metrics
CM360 provides comprehensive video metrics for in-stream video ads and VPAID creatives. These track the full viewing experience from initial play through completion, including user interactions like muting, pausing, and expanding to fullscreen.
| Metric | Description | Formula / Notes |
|---|---|---|
| Video Plays | Number of times the video ad started playing | Counts each time the video player initiates playback |
| Video Completions | Number of times the video was played to 100% of its duration | Full view — the strongest video engagement signal |
| Video First Quartile | Number of times the video played through 25% of its duration | Early engagement indicator — compare to plays for initial drop-off rate |
| Video Midpoint | Number of times the video played through 50% of its duration | Core engagement checkpoint — significant drop from 25% signals content issues |
| Video Third Quartile | Number of times the video played through 75% of its duration | Strong engagement signal — users who reach 75% typically go on to complete |
| Video Mutes | Number of times users muted the video ad | High mute rates may indicate auto-play-with-sound environments |
| Video Pauses | Number of times users paused the video ad | Can indicate interest or annoyance depending on context |
| Video Fullscreen | Number of times users expanded the video to fullscreen | Strong engagement signal — user actively chose a larger viewing experience |
| Companion Ad Impressions | Impressions of companion banners displayed alongside in-stream video | Companion ads appear next to or below the video player |
| Companion Clicks | Clicks on companion banner ads | Tracks engagement with the companion creative, not the video itself |
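The quartile metrics above are most useful as a retention funnel. A small sketch (our own helper, not a CM360 report type) that turns the raw counts into retention and drop-off percentages at each checkpoint:

```python
def video_funnel(plays, q25, q50, q75, completions):
    """Retention at each quartile checkpoint as a percentage of plays,
    plus the drop-off between consecutive checkpoints."""
    checkpoints = [("plays", plays), ("25%", q25), ("50%", q50),
                   ("75%", q75), ("100%", completions)]
    report = []
    for i, (label, count) in enumerate(checkpoints):
        retention = count / plays * 100 if plays else 0.0
        drop = (checkpoints[i - 1][1] - count) / plays * 100 if i and plays else 0.0
        report.append((label, round(retention, 1), round(drop, 1)))
    return report


for label, retention, drop in video_funnel(100_000, 82_000, 70_000, 63_000, 60_000):
    print(f"{label:>5}: {retention:5.1f}% retained, {drop:4.1f}% dropped")
```

With these illustrative numbers, the biggest loss (18%) happens before the first quartile — a signal that the opening seconds of the creative, not the ending, need attention.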
Verification Metrics
Verification metrics ensure your ads are being seen by real users in brand-safe environments. CM360 provides native verification through Active View and integrates with third-party verification vendors for additional coverage. These metrics are critical for maintaining media quality standards.
| Metric | Description | Formula / Notes |
|---|---|---|
| Viewable Impressions | Impressions that met the MRC viewability standard via Active View | Display: 50% pixels in view for 1 second. Video: 50% pixels for 2 seconds |
| In-View Rate | Percentage of measurable impressions that were viewable | (Viewable Impressions ÷ Measurable Impressions) × 100 |
| Time-in-View | Average duration (in seconds) that viewable impressions remained in view | Longer time-in-view correlates with higher ad recall and engagement |
| Measurable Rate | Percentage of total impressions where viewability could be measured | (Measurable Impressions ÷ Total Impressions) × 100 |
| Brand Safety | Classification of the content environment where the ad appeared | Categories based on content adjacency — flags potentially harmful contexts |
| IVT (Invalid Traffic) | Impressions identified as invalid traffic (bot, fraudulent, or otherwise non-human) | Split into General IVT (GIVT — known bots) and Sophisticated IVT (SIVT — harder to detect fraud) |
Verification best practices: A healthy in-view rate for display is 60-70%+. If measurable rate is below 80%, investigate whether certain sites or placements have technical issues preventing measurement. IVT rates above 5% warrant investigation — high IVT on specific sites or placements may indicate fraud. Brand safety flags should be reviewed regularly to ensure your ads are not appearing alongside inappropriate content.
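The rule-of-thumb thresholds above translate directly into an automated screening pass over site- or placement-level rows. A minimal sketch — the thresholds are this guide's heuristics, not CM360 rules, and should be tuned to your own historical data:

```python
def quality_flags(in_view_rate: float, measurable_rate: float,
                  ivt_rate: float) -> list:
    """Apply the rule-of-thumb verification thresholds (rates as percentages)
    and return a list of human-readable flags for review."""
    flags = []
    if in_view_rate < 60:          # healthy display in-view rate is 60-70%+
        flags.append("low viewability")
    if measurable_rate < 80:       # below 80% suggests measurement gaps
        flags.append("measurement gaps")
    if ivt_rate > 5:               # above 5% warrants a fraud investigation
        flags.append("possible invalid traffic")
    return flags


print(quality_flags(in_view_rate=55, measurable_rate=75, ivt_rate=6.5))
# ['low viewability', 'measurement gaps', 'possible invalid traffic']
print(quality_flags(in_view_rate=68, measurable_rate=90, ivt_rate=1.2))  # []
```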
Breakdowns
CM360 offers extensive breakdown dimensions that let you segment any metric by campaign structure, delivery context, and user characteristics. These breakdowns are essential for understanding what drives performance and optimizing your media plan across sites, placements, and audiences.
| Breakdown | Description |
|---|---|
| Advertiser | The advertiser account — useful for multi-advertiser agencies |
| Site | The publisher property where the ad was served |
| Placement | The specific ad slot within a site |
| Creative | The creative asset that was displayed |
| Date | Day, week, or month the impression was served |
| Device | Device category: Desktop, Mobile, Tablet |
| Browser | User's browser: Chrome, Safari, Firefox, Edge, Samsung Internet |
| Operating System | User's OS: Windows, macOS, iOS, Android, ChromeOS, Linux |
| Connection Type | Network connection: Broadband, Mobile (3G/4G/5G), WiFi |
| Country | User's country by ISO code |
| State / Region | State, province, or geographic region within a country |
| DMA (Designated Market Area) | Nielsen DMA zone — US-specific geographic breakdown for local advertising |
| Environment | Whether the ad was served in a web browser or mobile app environment |
CM360 Reporting and Data Access
CM360 provides multiple reporting interfaces and data export options, each suited to different analysis needs. Understanding your options helps you build an efficient reporting workflow.
Report Builder
CM360's Report Builder is the primary interface for creating custom reports. You select dimensions and metrics, apply filters (date range, advertiser, campaign, site), and choose the output format (CSV, Excel, or API-accessible). Reports can be scheduled for daily, weekly, or monthly delivery. Report Builder supports cross-dimension reports (combining site and creative dimensions in the same report) and Floodlight attribution reports with configurable lookback windows.
Standard reports
CM360 includes pre-built standard reports for common use cases: Delivery (impressions, clicks by campaign/site/placement), Floodlight (conversions by activity), Path to Conversion (multi-touch attribution paths), Reach (unique users), and Verification (viewability, brand safety, IVT). Standard reports are a good starting point before building custom reports.
CM360 API (Campaign Manager 360 API)
The CM360 Reporting API provides programmatic access to all report types. The API supports creating, running, and downloading reports with full control over dimensions, metrics, filters, and date ranges. It also provides access to the Conversions API for uploading offline conversion data and the Trafficking API for managing campaign configuration.
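A report request to the Reporting API is essentially a JSON body naming the dimensions, metrics, and date range. The sketch below assembles that body as a plain dict; the field names follow the shape of the API's report resource, but verify them against the current API reference before use, and note that actually inserting and running the report requires the Google API client library and authenticated credentials (omitted here):

```python
def build_standard_report(name: str, start: str, end: str,
                          dimensions: list, metrics: list) -> dict:
    """Assemble a STANDARD report body in the general shape the CM360
    Reporting API expects. Field names are from the report resource and
    should be checked against the current API documentation."""
    return {
        "name": name,
        "type": "STANDARD",
        "format": "CSV",
        "criteria": {
            "dateRange": {"startDate": start, "endDate": end},
            "dimensions": [{"name": d} for d in dimensions],
            "metricNames": metrics,
        },
    }


body = build_standard_report(
    "Delivery by site", "2026-01-01", "2026-01-31",
    dimensions=["campaign", "site"],
    metrics=["impressions", "clicks"],
)
print(body["criteria"]["metricNames"])  # ['impressions', 'clicks']
```

Building the body as data first makes it easy to template report definitions across advertisers before handing them to the API client.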
BigQuery Data Transfer
CM360 supports automatic data transfer to Google BigQuery through the BigQuery Data Transfer Service. This creates daily exports of delivery, conversion, and rich media data into BigQuery tables. BigQuery integration enables SQL-based analysis, cross-platform data joins (combining CM360 data with GA4, DV360, or CRM data), and advanced attribution modeling. BigQuery is the recommended approach for large-scale cross-channel analysis.
Data freshness and reconciliation
CM360 delivery data (impressions, clicks) is typically available within 4-6 hours. Floodlight conversion data may take 24-48 hours to finalize due to attribution window processing. Rich media interaction data is available within 24 hours. When reconciling CM360 data with publisher reports or DV360 data, expect discrepancies of 5-10% — these are normal due to timing differences, ad blocking, and counting methodology variations.
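The 5-10% tolerance above is easy to operationalize as a reconciliation check. A minimal sketch — the threshold and the choice of CM360 as the reference denominator are conventions, not rules:

```python
def discrepancy_pct(cm360_count: int, other_count: int) -> float:
    """Absolute percentage gap between CM360 and a second source
    (publisher or DV360), relative to the CM360 count."""
    if cm360_count == 0:
        return 0.0
    return abs(cm360_count - other_count) / cm360_count * 100


# A 7% gap sits inside the normal 5-10% range; above 10% warrants a review
# of trafficking, tag implementation, and counting methodology.
gap = discrepancy_pct(1_000_000, 930_000)
print(round(gap, 1), "flag" if gap > 10 else "ok")  # 7.0 ok
```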
CM360 Metric Benchmarks by Ad Type
Understanding performance benchmarks helps you evaluate whether your campaigns are performing within normal ranges. These benchmarks vary by industry, geography, and creative quality, but provide useful reference points.
Standard display benchmarks
| Metric | Below Average | Average | Above Average |
|---|---|---|---|
| CTR | < 0.05% | 0.05% - 0.12% | > 0.12% |
| In-View Rate | < 50% | 50% - 65% | > 65% |
| Measurable Rate | < 70% | 70% - 85% | > 85% |
| IVT Rate | > 8% | 3% - 8% | < 3% |
Rich media benchmarks
| Metric | Below Average | Average | Above Average |
|---|---|---|---|
| Interaction Rate | < 1% | 1% - 3% | > 3% |
| Avg. Interaction Time | < 5 seconds | 5 - 15 seconds | > 15 seconds |
| Expansion Rate | < 0.5% | 0.5% - 2% | > 2% |
In-stream video benchmarks
| Metric | Below Average | Average | Above Average |
|---|---|---|---|
| Completion Rate | < 55% | 55% - 75% | > 75% |
| CTR | < 0.1% | 0.1% - 0.5% | > 0.5% |
| In-View Rate | < 60% | 60% - 80% | > 80% |
Benchmarks context: These ranges are approximate industry averages. Premium publisher placements typically perform above average for viewability and engagement but command higher CPMs. Programmatic open auction inventory tends toward average or below average on quality metrics. Always benchmark first against your own historical data, then use industry averages as secondary reference points.
Key Differences: CM360 vs Other Measurement Platforms
Understanding how CM360's metrics compare to other measurement and attribution systems helps you navigate multi-platform reporting and avoid misinterpreting data.
CM360 vs Google Analytics 4
CM360 is an ad server that tracks ad delivery (impressions, clicks) and conversion attribution through Floodlight tags. GA4 is a web analytics platform that tracks website behavior (sessions, page views, events). CM360 impressions are ad server calls; GA4 sessions are website visits. CM360 conversions use Floodlight attribution (click-through/view-through windows); GA4 uses event-based attribution with data-driven modeling. Both systems will report different conversion numbers for the same campaigns because they use different attribution methodologies.
CM360 vs DV360 reporting
CM360 provides measurement-focused data (what was served, what was viewable, what converted). DV360 provides buying-focused data (what was bid on, what was won, what it cost). CM360 impressions include all ad-served impressions regardless of buying source. DV360 impressions only include programmatically purchased inventory. When both platforms report the same campaign, expect CM360 impressions to be equal to or greater than DV360 impressions.
CM360 vs third-party verification vendors
CM360 uses Active View for viewability measurement, while third-party vendors like IAS, DoubleVerify, and MOAT use their own measurement methodologies. Viewability numbers will differ between CM360 and third-party vendors — typically by 5-15%. This is normal and results from different measurement timing, tag loading sequences, and definition nuances. When contractual viewability guarantees are involved, agree in advance on which measurement source will be the source of truth.
How to Use CM360 Metrics for Campaign Measurement
CM360's strength is centralized measurement across all media buys. Here's a practical framework for leveraging its metrics effectively at each stage of campaign analysis.
For delivery verification
Compare CM360 impressions against publisher-reported impressions and DV360/Google Ads impressions to identify discrepancies. Significant gaps indicate trafficking errors, tag implementation issues, or inventory quality problems. Use site and placement breakdowns to pinpoint where discrepancies occur.
For conversion analysis
Break down total conversions by Floodlight activity to understand which actions your ads drive. Separate click-through from view-through conversions to assess attribution quality. Monitor cross-environment conversions to understand multi-device conversion paths. Use CPA by site and placement to identify the most efficient media placements.
For creative optimization
Use rich media interaction time and interactions to evaluate creative engagement quality beyond clicks. Compare video completion rates across different creative executions to identify which storytelling approaches retain attention. Use custom variable counts to understand in-ad behavior like gallery navigation and form engagement.
For media quality assurance
Monitor in-view rate, measurable rate, and IVT rate by site and placement. Flag sites with viewability below 50%, measurability below 70%, or IVT above 5% for review. Use brand safety classifications to ensure your ads appear in appropriate content environments.
Common Mistakes When Analyzing CM360 Data
Even experienced ad operations professionals make these mistakes when working with CM360 data. Avoiding them will ensure accurate reporting and sound optimization decisions.
1. Confusing CM360 and DV360 impression counts
CM360 impressions are ad server counts (the ad was called and served). DV360 impressions are media counts (the bid was won and the ad was delivered). Discrepancies are normal — CM360 may count impressions from multiple buying sources, while DV360 only counts its own buys. Always specify which platform's impressions you're using when reporting.
2. Ignoring Floodlight attribution windows when comparing platforms
CM360 Floodlight default windows (often 30 days click-through, 30 days view-through) are typically much longer than other platforms. Meta Ads defaults to 7-day click and 1-day view. Google Ads defaults to 30-day click. When comparing conversion counts across platforms, normalize to the same attribution windows or explicitly note the differences.
3. Not deduplicating conversions across campaigns
When a user is exposed to ads from multiple CM360 campaigns and converts, the conversion may be attributed to multiple campaigns depending on your deduplication settings. Ensure your Floodlight configuration uses appropriate deduplication rules to avoid double-counting conversions across overlapping campaigns.
4. Treating view-through conversions equally with click-through
View-through conversions have inherently weaker attribution than click-through conversions. A user who saw a display banner and bought your product two weeks later may have converted regardless. Weight click-through conversions more heavily in performance analysis and use view-through as a directional signal.
5. Overlooking cross-environment conversions
Cross-environment conversions (where the ad impression and conversion happen on different devices) represent an increasingly large share of total conversions as users switch between mobile and desktop. Ignoring this metric understates the true impact of mobile advertising. Include cross-environment conversions in your total conversion count.
6. Averaging CPA or conversion rates across placements
CPA and conversion rates are ratios. Averaging them across placements gives mathematically incorrect results. Instead, sum total cost and total conversions separately, then calculate the ratio from the totals. For CPA: total cost ÷ total conversions. For conversion rate: total conversions ÷ total clicks.
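The error is easiest to see with two placements of very different sizes. Illustrative numbers only:

```python
placements = [
    # (cost, conversions) per placement
    (500.0, 100),   # per-placement CPA:  5.00
    (4_000.0, 50),  # per-placement CPA: 80.00
]

# Wrong: averaging the per-placement CPAs ignores conversion volume.
naive = sum(cost / conv for cost, conv in placements) / len(placements)

# Right: pool the totals first, then divide once.
pooled = sum(cost for cost, _ in placements) / sum(conv for _, conv in placements)

print(naive)   # 42.5  — the naive average
print(pooled)  # 30.0  — the true blended CPA
```

The naive average overstates blended CPA by over 40% here because the cheap placement drove twice the conversions but gets equal weight. The same pooling rule applies to conversion rate, CTR, and any other ratio metric.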
7. Not validating Floodlight tag implementation
Floodlight tags can break silently — a website redesign, tag manager update, or consent management change can stop Floodlight from firing without generating errors in CM360. Regularly validate that Floodlight tags are firing correctly by checking the Floodlight activity report for sudden drops in conversion volume. Set up monitoring alerts for activities that drop below expected daily minimums.
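The monitoring described above can start as a trivial threshold check over daily activity counts — a stand-in sketch for a real alerting pipeline, with hypothetical activity names:

```python
def flag_conversion_drops(daily_counts: dict, expected_minimum: int) -> list:
    """Return the Floodlight activities whose latest daily conversion count
    fell below the expected minimum, as candidates for a broken-tag alert."""
    return [activity for activity, count in daily_counts.items()
            if count < expected_minimum]


# Hypothetical daily counts per Floodlight activity
today = {"purchase": 2, "signup": 180, "newsletter": 95}
print(flag_conversion_drops(today, expected_minimum=50))  # ['purchase']
```

In production you would compare against a per-activity baseline (e.g., a trailing 7-day median) rather than one global minimum, since activities naturally run at very different volumes.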
What Changed in CM360 in 2024-2026
Google has made significant updates to CM360's capabilities and reporting features. Understanding these changes is critical for maintaining accurate measurement and leveraging new features.
Enhanced Floodlight conversion modeling
CM360 has expanded its conversion modeling capabilities to account for increasingly limited cookie-based tracking. Modeled conversions now appear alongside observed conversions in Floodlight reports, with a clear indicator of which conversions are modeled versus directly observed. The modeling uses machine learning trained on observed conversion patterns to estimate conversions that cannot be directly attributed due to cookie restrictions or consent requirements.
Cross-environment measurement improvements
Cross-environment conversion measurement has been enhanced with better device graph coverage and more accurate cross-device matching. CM360 now provides more granular breakdowns of cross-environment paths — showing not just that a conversion crossed devices, but which device types were involved in the path (e.g., mobile impression → desktop conversion, CTV impression → mobile conversion).
Privacy-safe attribution
CM360 has integrated with Google's Privacy Sandbox Attribution Reporting API for Chrome-based attribution. This provides privacy-preserving conversion measurement that works without third-party cookies. Attribution reports from the Privacy Sandbox are aggregated and may have noise added for privacy protection — metrics may be slightly less precise than cookie-based attribution but maintain directional accuracy.
Verification and IVT detection updates
CM360's invalid traffic detection has been upgraded with improved algorithms for identifying sophisticated invalid traffic (SIVT). New brand safety categories have been added to reflect evolving content concerns. Active View measurement has been updated to better handle modern web technologies including single-page applications, lazy-loaded content, and AMP pages.
Reporting UI improvements
Report Builder has received updates including faster report generation, improved scheduling options, and better cross-dimensional analysis capabilities. New report types include Privacy Impact reports (showing the percentage of data affected by cookie restrictions) and Enhanced Attribution reports with path analysis. The BigQuery Data Transfer has been expanded with more granular log-level data exports.
