Understanding the full landscape of Display & Video 360 dimensions and metrics is essential for anyone running programmatic campaigns through Google's demand-side platform. Whether you're building custom reports in DV360, pulling data through the API, or analyzing performance in connected BigQuery datasets, knowing exactly what data is available — and what each field means — is the foundation of effective programmatic optimization.
This guide provides a complete reference of every dimension and metric available in DV360 as of 2026. We've organized them by hierarchy level — campaigns, insertion orders, line items, and creatives — and included practical context on when and how to use each one.
What Are DV360 Dimensions vs Metrics?
Before diving into the full reference, it's important to understand the difference between dimensions and metrics — two concepts that serve fundamentally different purposes in programmatic advertising data.
Dimensions are descriptive attributes that define what you're looking at. They are the labels, categories, and identifiers that let you organize and filter your data. Examples include campaign name, insertion order ID, line item type, creative format, and exchange. Dimensions answer the question: "How do I want to slice this data?"
Metrics are quantitative measurements that tell you how things performed. They are the numbers: impressions, clicks, spend, conversions, viewability rate, video completion rate. Metrics answer the question: "What happened with my programmatic campaigns?"
Breakdowns are cross-cutting dimensions you can apply to any metric to segment performance by criteria like exchange, device type, geo, audience segment, or time of day.
How Is DV360 Data Structured?
DV360 data follows a strict three-level hierarchy: Campaign > Insertion Order (IO) > Line Item. Each level inherits configuration from above and adds its own settings. Campaigns define the overall objective and flight dates. Insertion orders control budget allocation and pacing. Line items handle targeting, bidding, creative assignment, and actual ad delivery. Creatives are the ad assets assigned to line items. Metrics can be queried at any level — line item metrics roll up to IO, campaign, and advertiser totals.
DV360 integrates tightly with Campaign Manager 360 (CM360) for ad serving and conversion tracking. Floodlight activities defined in CM360 are the primary mechanism for tracking conversions in DV360. Understanding this relationship is critical for accurate attribution reporting.
Campaign Dimensions
Campaign-level dimensions define the top-level structure of your programmatic advertising. These fields identify the campaign and its core configuration — objective, budget, flight dates, and frequency management. Use these dimensions to organize reporting by campaign strategy.
| Dimension | Description |
|---|---|
| Campaign ID | Unique numeric identifier for the campaign in DV360 |
| Campaign Name | The name of the campaign as set by the advertiser or agency |
| Campaign Status | Current state: Active, Paused, Archived, or Draft |
| Campaign Budget | Total budget allocated to the campaign across all insertion orders |
| Flight Start Date | The date the campaign is scheduled to begin delivering |
| Flight End Date | The date the campaign is scheduled to stop delivering |
| Campaign Objective | The goal type: Brand Awareness, Online Action, App Install, Offline and In-Store, or Custom |
| Frequency Cap | Maximum number of impressions per user over a specified time period at the campaign level |
Insertion Order Dimensions
Insertion orders (IOs) sit between campaigns and line items. They control budget allocation, pacing strategy, and performance goals for groups of line items. Think of IOs as budget containers — each IO can have its own budget, schedule, and optimization targets independent of the parent campaign.
| Dimension | Description |
|---|---|
| Insertion Order ID | Unique numeric identifier for the IO |
| Insertion Order Name | The name of the insertion order as defined by the buyer |
| IO Budget | Total budget allocated to this insertion order in the campaign currency |
| IO Pacing | How the IO distributes spend over its flight: Even (uniform daily spend), ASAP (spend as quickly as possible), or Ahead (front-loaded) |
| IO Flight Start Date | When the IO begins delivering — can differ from the parent campaign flight |
| IO Flight End Date | When the IO stops delivering |
| Performance Goal | The KPI the IO optimizes toward: CPA, CPC, CPM, CPIAVC (cost per impression audible and visible at completion), or CTR |
| IO Frequency Cap | Maximum impressions per user at the insertion order level — overrides campaign cap if more restrictive |
Line Item Dimensions
Line items are the workhorses of DV360 — they control every aspect of ad delivery including targeting, bidding, budget, creative assignment, and frequency capping. Each line item has a specific type that determines what kind of inventory it can access: display, video, audio, YouTube, or mobile app install.
| Dimension | Description |
|---|---|
| Line Item ID | Unique numeric identifier for the line item |
| Line Item Name | The name of the line item as defined by the buyer |
| Line Item Type | The inventory type: Display, Video, Audio, YouTube & Partners, Mobile App Install, or TrueView |
| Bid Strategy | How the line item bids in auction: Fixed, Maximize Performance, Maximize Conversions, or Target CPA |
| Line Item Budget | Daily or total budget allocated to the line item |
| Targeting | Audience, contextual, geographic, device, and inventory targeting criteria applied to the line item |
| Creative Assignment | Which creatives are assigned to this line item — can be manual or auto-optimized rotation |
| Line Item Frequency Cap | Maximum impressions per user at the line item level — the most granular frequency control |
| Pacing | Delivery pacing for the line item: Even, ASAP, or Ahead |
Creative Dimensions
Creative dimensions describe the actual ad assets served to users. DV360 supports a wide range of creative formats — from standard display banners to video pre-rolls, native ads, audio spots, and rich media experiences. Understanding creative dimensions is essential for creative performance analysis and format optimization.
| Dimension | Description |
|---|---|
| Creative ID | Unique identifier for the creative asset in DV360 |
| Creative Name | Name of the creative as defined by the advertiser |
| Creative Format | Type of creative: Display, Video, Native, Audio, or Rich Media |
| Creative Size | Pixel dimensions of the creative (e.g., 300x250, 728x90, 160x600, 320x50) |
| Landing Page URL | The destination URL users are directed to when clicking the creative |
| Third-Party Tracking | External impression and click tracking URLs from third-party verification vendors |
| DCM Creative ID | Corresponding creative ID in Campaign Manager 360 when using CM360 as the ad server |
Core Performance Metrics
These are the fundamental metrics that measure how your programmatic ads are delivered and interacted with. Every DV360 buyer should understand these — they form the basis of all campaign analysis and optimization decisions.
| Metric | Description | Formula / Notes |
|---|---|---|
| Impressions | Number of times your ads were served | Counts each ad impression including repeats to the same user |
| Clicks | Number of clicks on your ads | Counts all clicks that direct users to the landing page |
| CTR | Click-through rate | (Clicks ÷ Impressions) × 100 |
| CPC | Average cost per click | Total Media Cost ÷ Clicks |
| CPM | Cost per 1,000 impressions | (Total Media Cost ÷ Impressions) × 1,000 |
| Total Media Cost | Amount paid to publishers for ad inventory | Raw media spend excluding partner fees and data costs |
| Total Cost | Complete cost including media, partner fees, data costs, and platform charges | Total Media Cost + Partner Costs + Data Fees + Platform Fees |
| Billable Impressions | Impressions that were actually billed to the advertiser | May differ from total impressions when using viewability-based billing |
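As a sanity check on the formulas in the table, the rate metrics can be computed directly from the raw counts. This is an illustrative sketch — the function names and example numbers are invented, not DV360 report columns:

```python
def ctr(clicks: int, impressions: int) -> float:
    """Click-through rate as a percentage: (Clicks / Impressions) * 100."""
    return clicks / impressions * 100 if impressions else 0.0

def cpc(total_media_cost: float, clicks: int) -> float:
    """Average cost per click: Total Media Cost / Clicks."""
    return total_media_cost / clicks if clicks else 0.0

def cpm(total_media_cost: float, impressions: int) -> float:
    """Cost per 1,000 impressions: (Total Media Cost / Impressions) * 1,000."""
    return total_media_cost / impressions * 1000 if impressions else 0.0

# Hypothetical report row: 250,000 impressions, 300 clicks, $1,250 media cost
print(round(ctr(300, 250_000), 3))     # 0.12 (%)
print(round(cpc(1250.0, 300), 2))      # 4.17
print(round(cpm(1250.0, 250_000), 2))  # 5.0
```

Note that CPC and CPM use total media cost here, matching the table; substitute total cost when you need all-in efficiency figures.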
Viewability Metrics
DV360 uses Active View — Google's MRC-accredited viewability measurement technology — to determine whether ads were actually seen by users. Viewability metrics are critical for programmatic buyers because not all served impressions are actually viewable. Optimizing toward viewable inventory improves campaign effectiveness and reduces wasted spend.
| Metric | Description | Formula / Notes |
|---|---|---|
| Active View Viewable Impressions | Impressions that met the MRC viewability standard | Display: 50% of pixels in view for 1 second. Video: 50% of pixels in view for 2 continuous seconds |
| Active View Viewable Rate | Percentage of measurable impressions that were viewable | (Viewable Impressions ÷ Measurable Impressions) × 100 |
| Active View Measurable Impressions | Impressions where viewability could be measured | Some impressions cannot be measured due to technical limitations (cross-domain iframes, etc.) |
| Active View Measurable Rate | Percentage of total impressions that were measurable | (Measurable Impressions ÷ Total Impressions) × 100 |
| Active View Eligible Impressions | Impressions served on inventory that supports Active View measurement | Eligible impressions are a superset of measurable impressions |
| Average Viewable Time | Average duration (in seconds) that viewable impressions remained in view | Higher values indicate more engaged placements — useful for brand awareness campaigns |
Interpreting viewability data: A healthy viewable rate for display is 60-70%+. Video viewability tends to be higher (70-80%+) because video players are typically larger and more prominently placed. If your measurable rate is below 80%, investigate whether certain exchanges or inventory sources have measurement gaps. Use the viewability tier breakdown to see performance across high, medium, and low viewability buckets.
Conversion Metrics
DV360 tracks conversions through Floodlight activities configured in Campaign Manager 360. Every conversion is categorized by attribution type — post-click (user clicked your ad then converted) or post-view (user saw your ad but did not click, then converted within the attribution window). Understanding these metrics is essential for measuring programmatic campaign ROI.
| Metric | Description | Formula / Notes |
|---|---|---|
| Total Conversions | Sum of all post-click and post-view conversions | Includes all Floodlight activities attributed to the campaign |
| Post-Click Conversions | Conversions where the user clicked the ad before converting | Generally considered higher-quality signal than post-view |
| Post-View Conversions | Conversions where the user saw the ad but did not click before converting | Attribution window typically 1-30 days — configured per Floodlight activity |
| CPA (Cost Per Acquisition) | Average cost per conversion | Total Cost ÷ Total Conversions |
| Conversion Rate | Percentage of clicks that resulted in a conversion | (Total Conversions ÷ Clicks) × 100 |
| Floodlight Activities | Individual Floodlight tags that track specific conversion events | Can be broken down by activity name to see performance per conversion type (purchase, sign-up, etc.) |
Attribution windows matter: The post-click and post-view lookback windows typically default to 30 days each, but both are configurable per Floodlight activity. Shorter windows produce fewer conversions but higher confidence. Always verify which attribution windows are active when comparing conversion data across campaigns or platforms.
Video Metrics
DV360 provides comprehensive video metrics that track the full viewing experience from the initial play through completion. These metrics are essential for evaluating video creative performance, understanding audience engagement, and optimizing video line items for maximum impact.
| Metric | Description | Formula / Notes |
|---|---|---|
| Video Starts | Number of times the video began playing | Counts each time the video player initiates playback |
| First Quartile (25%) | Number of times the video played through 25% of its duration | Early engagement indicator — compare to starts for initial drop-off rate |
| Midpoint (50%) | Number of times the video played through 50% of its duration | Core engagement checkpoint — significant drop from 25% indicates content issues |
| Third Quartile (75%) | Number of times the video played through 75% of its duration | Strong engagement signal — users who reach 75% almost always complete |
| Completions | Number of times the video was played to 100% | Full video view — the strongest engagement signal |
| VCR (Video Completion Rate) | Percentage of video starts that resulted in a completion | (Completions ÷ Starts) × 100 |
| Companion Impressions | Impressions of companion ads displayed alongside the video | Companion banners typically appear next to or below the video player |
| Companion Clicks | Clicks on companion ads displayed alongside the video | Tracks engagement with companion creative, not the video itself |
| Skips | Number of times users skipped the video ad | Available for skippable video formats (e.g., TrueView/YouTube) |
| Mutes | Number of times users muted the video | High mute rates may indicate auto-play with sound environments |
| Unmutes | Number of times users unmuted the video | Indicates deliberate engagement — user chose to hear the audio |
| Pauses | Number of times users paused the video | Can indicate interest (pausing to examine content) or annoyance |
| Fullscreens | Number of times users expanded the video to fullscreen | Strong engagement signal — user actively chose a larger viewing experience |
Reading the video funnel: Compare the progression from starts to 25% to 50% to 75% to completion to understand exactly where viewers drop off. A healthy pre-roll VCR is typically 60-80%. If the drop from starts to 25% is steep (over 40%), your creative hook needs improvement. If 75% to completion drops significantly, your call-to-action timing may be too late.
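The funnel read above reduces to a drop-off calculation between consecutive quartile counts. The counts in this sketch are invented for illustration:

```python
# Hypothetical quartile counts from a video report row.
funnel = {"starts": 10_000, "q25": 7_200, "q50": 6_100, "q75": 5_400, "complete": 5_000}

# Drop-off between each consecutive stage of the funnel.
stages = list(funnel.items())
for (prev_name, prev_count), (name, count) in zip(stages, stages[1:]):
    drop = (prev_count - count) / prev_count * 100
    print(f"{prev_name} -> {name}: -{drop:.1f}%")

# VCR = Completions / Starts
vcr = funnel["complete"] / funnel["starts"] * 100
print(f"VCR: {vcr:.1f}%")  # 50.0% -- below the typical 60-80% pre-roll range
```

Here the steepest loss is starts to 25% (-28.0%), so by the guidance above the creative hook, not the call-to-action, is the first thing to fix.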
Audience & Targeting Breakdowns
DV360 offers extensive breakdown dimensions that let you segment any metric by audience characteristics, inventory properties, and delivery context. These breakdowns are essential for understanding what drives performance and optimizing targeting, inventory selection, and creative strategy.
| Breakdown | Description |
|---|---|
| Exchange | The ad exchange where the impression was served: Google Ad Exchange, OpenX, Index Exchange, PubMatic, Magnite, etc. |
| Inventory Source | The specific deal, programmatic guaranteed, or open auction inventory source |
| Device | Device category: Desktop, Mobile, Tablet, Connected TV, or Set-Top Box |
| Geo (Country/Region/City) | Geographic location of the user at impression time — available at country, region, city, and postal code levels |
| Browser | User's browser: Chrome, Safari, Firefox, Edge, Samsung Internet, etc. |
| Operating System | User's OS: Windows, macOS, iOS, Android, ChromeOS, Linux, etc. |
| Audience Segment | First-party, third-party, or Google audience segment the user belongs to |
| Day / Time | Day of week and hour of day when the impression was served |
| Environment | Whether the ad was served in an App or Web environment |
| Viewability Tier | Viewability bucket: High (70%+), Medium (40-70%), Low (below 40%) based on historical viewability of the placement |
Using breakdowns effectively: Start with exchange and inventory source breakdowns to understand where your budget is being spent and which supply paths deliver the best performance. Use device and environment breakdowns to optimize bid adjustments — CTV inventory typically commands higher CPMs but delivers stronger brand metrics. Audience segment breakdowns reveal which targeting layers drive the most conversions, helping you allocate budget to the highest-value segments.
Reach & Frequency Metrics
Reach and frequency metrics help you understand how many unique users your campaigns are reaching and how often they see your ads. These metrics are especially important for brand awareness campaigns where controlling frequency and maximizing unduplicated reach are primary objectives.
| Metric | Description | Formula / Notes |
|---|---|---|
| Unique Reach | Estimated number of unique users who saw your ad at least once | Deduplicated count — cannot be summed across date ranges |
| Average Impression Frequency | Average number of times each unique user saw your ad | Impressions ÷ Unique Reach |
| Cookie Reach | Number of unique cookies that received an impression | Cookie-based measurement — may overcount users across devices and browsers |
| Unique Reach by Demo | Unique reach broken down by demographic groups (age, gender) | Uses Google's demographic estimation models — available for verified audiences |
Reach vs. cookie reach: Unique reach uses Google's cross-device models to estimate true user-level reach, while cookie reach counts individual cookies. A single user on two devices with three browsers could appear as six cookies but one unique reach. For accurate audience sizing, always use unique reach over cookie reach.
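The cookie-inflation effect described above is easy to see in miniature. This toy sketch (hypothetical identifiers, not real DV360 data) shows one person fanning out into six cookies:

```python
# One person, two devices, three browsers on each -> six distinct cookies.
cookies_by_person = {
    "person_1": [
        "laptop_chrome", "laptop_firefox", "laptop_edge",
        "phone_chrome", "phone_firefox", "phone_samsung_internet",
    ],
}

cookie_reach = sum(len(cookies) for cookies in cookies_by_person.values())
unique_reach = len(cookies_by_person)
print(cookie_reach, unique_reach)  # 6 1
```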
DV360 Metric Benchmarks by Line Item Type
Understanding what "good" looks like for DV360 metrics depends heavily on the line item type. Display, video, YouTube, audio, and mobile app install line items each have fundamentally different performance characteristics. Here are typical benchmarks to help you evaluate whether your campaigns are performing within normal ranges.
Display line item benchmarks
| Metric | Below Average | Average | Above Average |
|---|---|---|---|
| CTR | < 0.05% | 0.05% - 0.15% | > 0.15% |
| Viewable Rate | < 50% | 50% - 70% | > 70% |
| CPM (Total Cost) | > $12 | $4 - $12 | < $4 |
| Measurable Rate | < 70% | 70% - 85% | > 85% |
Video line item benchmarks
| Metric | Below Average | Average | Above Average |
|---|---|---|---|
| VCR | < 55% | 55% - 75% | > 75% |
| Viewable Rate | < 60% | 60% - 80% | > 80% |
| CPM (Total Cost) | > $25 | $10 - $25 | < $10 |
| CPCV | > $0.08 | $0.03 - $0.08 | < $0.03 |
Important context for benchmarks: These ranges vary significantly by vertical, geography, and inventory quality. Premium publishers and private marketplace deals typically show higher CPMs but better viewability and engagement. Open auction inventory is cheaper but may have lower quality scores. Always benchmark against your own historical performance first, then use industry averages as secondary reference points.
DV360 Reporting API and Data Access
DV360 data can be accessed through several channels, each suited to different use cases. Understanding your options is important for building an efficient reporting infrastructure.
DV360 Reporting UI
The built-in reporting interface supports standard, Instant, and Offline reports. Standard reports run on demand with customizable dimensions, metrics, filters, and date ranges. Instant reports provide quick access to common views. Offline reports are scheduled and delivered to Google Cloud Storage or via email for large datasets.
DV360 API (Display & Video 360 API)
The Display & Video 360 API provides programmatic access to campaign management resources, while reporting runs through the companion Bid Manager API, whose Queries and Reports resources let you create and run custom reports with your choice of dimensions, metrics, filters, and date ranges. API reports support the same dimensions and metrics available in the UI, plus additional metadata fields.
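For orientation, this is roughly the shape of a report definition for the Bid Manager API's `queries.create` method (v2). The enum values follow the published schema as we understand it, but the advertiser ID is a placeholder and you should verify field names against the current API reference before use:

```python
# Sketch of a Bid Manager v2 query body (not a live API call).
# "123456" is a placeholder advertiser ID.
query_body = {
    "metadata": {
        "title": "Line item performance",
        "dataRange": {"range": "LAST_30_DAYS"},
        "format": "CSV",
    },
    "params": {
        "type": "STANDARD",
        "groupBys": ["FILTER_ADVERTISER", "FILTER_LINE_ITEM"],
        "filters": [{"type": "FILTER_ADVERTISER", "value": "123456"}],
        "metrics": [
            "METRIC_IMPRESSIONS",
            "METRIC_CLICKS",
            "METRIC_TOTAL_MEDIA_COST_ADVERTISER",
        ],
    },
    "schedule": {"frequency": "ONE_TIME"},
}
print(sorted(query_body))  # ['metadata', 'params', 'schedule']
```

You would pass this body to `queries.create`, then trigger `queries.run` and poll for the generated report file.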
BigQuery Data Transfer
DV360 supports automatic data transfer to Google BigQuery through the BigQuery Data Transfer Service. This creates daily exports of your DV360 data into BigQuery tables, enabling SQL-based analysis, cross-platform data joins, and integration with visualization tools like Looker and Looker Studio (formerly Data Studio). BigQuery is the recommended approach for large-scale analysis and custom attribution modeling.
Data freshness and latency
DV360 UI reports are typically available within 2-4 hours of delivery. API reports may take up to 24 hours for complex queries. BigQuery data transfers are processed daily. Conversion data (Floodlight) may take an additional 24-48 hours to finalize due to attribution window processing. Never make optimization decisions based on same-day conversion data — wait at least 48 hours for reliable numbers.
How to Use DV360 Metrics for Campaign Optimization
Having access to dozens of metrics is powerful, but knowing which ones matter for your specific goals is what separates effective programmatic buyers from those drowning in data. Here's a practical framework for selecting the right metrics.
For brand awareness campaigns
Focus on unique reach, average impression frequency, viewable rate, average viewable time, and VCR (for video). Monitor frequency to prevent ad fatigue — generally keep it below 5-7 for display and 3-5 for video. Use the viewability tier breakdown to ensure you're buying inventory that is actually seen.
For performance/conversion campaigns
Prioritize CPA, conversion rate, post-click conversions, and total cost. Break down conversions by Floodlight activity to understand which actions drive the most value. Use exchange and inventory source breakdowns to identify the highest-converting supply paths and shift budget accordingly.
For video campaigns
Track the video completion funnel from starts through quartiles to completions. Use VCR as the headline efficiency metric. Monitor companion impressions and companion clicks to evaluate the impact of companion banners. The skip rate (for skippable formats) tells you whether your creative hooks are strong enough to retain attention.
For viewability optimization
Monitor viewable rate as the primary metric, but also check measurable rate to ensure your data is complete. Use the viewability tier breakdown to identify inventory sources with consistently high viewability. Consider switching to viewable CPM (vCPM) bidding to only pay for impressions that meet the MRC viewability standard.
Key Differences: DV360 vs Other DSPs
Understanding how DV360's metrics compare to other demand-side platforms helps you navigate multi-platform campaigns and avoid misinterpreting data when switching between tools.
DV360 vs The Trade Desk
Both platforms report impressions, clicks, and conversions, but their cost structures differ. DV360 uses "total media cost" and "total cost" while TTD explicitly separates media cost, data cost, and platform fees. DV360's total cost is conceptually equivalent to TTD's total spend. DV360's Floodlight-based conversion tracking through CM360 differs from TTD's proprietary pixel system — cross-platform conversion deduplication requires careful attribution configuration.
DV360 vs Meta Ads
Meta Ads and DV360 measure fundamentally different ecosystems. Meta reports reach as people-based (logged-in users), while DV360 reach is cookie/device-based (less accurate for cross-device). Meta's "link clicks" are not equivalent to DV360's "clicks" — Meta separates link clicks from all clicks (likes, comments, shares), while DV360 clicks are exclusively ad clicks. Conversion attribution models also differ: Meta uses its own event tracking while DV360 uses Floodlight through CM360.
DV360 vs Google Ads
Both are Google products but serve different purposes. Google Ads focuses on search, Shopping, YouTube, and display through the Google Display Network. DV360 provides access to premium programmatic inventory across hundreds of exchanges. Google Ads uses its own conversion tracking while DV360 uses Floodlight through CM360. When running both platforms, ensure Floodlight deduplication is configured to avoid double-counting conversions.
Common Mistakes When Analyzing DV360 Data
Even experienced programmatic buyers make these mistakes when working with DV360 metrics. Avoiding them will save you from flawed analyses and poor optimization decisions.
1. Using total media cost instead of total cost for ROI calculations
Total media cost only captures what you paid publishers for inventory. It excludes partner fees, data costs, and platform charges — which can add 15-30% on top of media cost. Always use total cost when calculating CPA, ROAS, or any efficiency metric. Otherwise, you're systematically understating your true cost per acquisition.
2. Ignoring the viewable rate when evaluating CPM
A $5 CPM with 40% viewability is actually a $12.50 viewable CPM. An $8 CPM with 80% viewability is a $10 viewable CPM. Always calculate effective viewable CPM (CPM ÷ Viewable Rate) to compare inventory sources on an equal footing. The cheapest impression is not always the most cost-effective.
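The comparison above is one line of arithmetic, worth wrapping in a helper when screening many inventory sources (numbers here are the illustrative ones from the text):

```python
def viewable_cpm(cpm: float, viewable_rate: float) -> float:
    """Effective cost per 1,000 *viewable* impressions: CPM / Viewable Rate."""
    return cpm / viewable_rate

print(round(viewable_cpm(5.0, 0.40), 2))  # 12.5 -- the "cheap" $5 CPM
print(round(viewable_cpm(8.0, 0.80), 2))  # 10.0 -- pricier, but the better buy
```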
3. Comparing post-view and post-click conversions without context
Post-view conversions are inherently less attributable than post-click conversions. A user who saw a banner ad and then searched for your brand three days later may have converted regardless. When evaluating campaign performance, give more weight to post-click conversions and use post-view conversions as a directional signal, not a definitive metric.
4. Summing reach across time periods
Just like other platforms, DV360 reach is a deduplicated metric. A user who saw your ad in January and February counts as one unique person for Q1 but appears in both monthly counts. Summing monthly unique reach will overstate your actual audience size. Always query reach for the full period you want to measure.
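A toy example makes the overcounting concrete. Treating monthly audiences as sets (hypothetical user IDs) shows why reach figures can't be added across periods:

```python
# Hypothetical user IDs reached in each month.
jan_reach = {"u1", "u2", "u3", "u4"}
feb_reach = {"u3", "u4", "u5"}

summed = len(jan_reach) + len(feb_reach)  # 7 -- overstated "Q1 reach"
q1_reach = len(jan_reach | feb_reach)     # 5 -- true deduplicated reach
print(summed, q1_reach)

# Average Impression Frequency = Impressions / Unique Reach
impressions = 40
print(impressions / q1_reach)  # 8.0
```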
5. Overlooking frequency caps across hierarchy levels
DV360 allows frequency caps at campaign, IO, and line item levels. These caps interact — the most restrictive cap takes precedence. If your campaign cap is 10/week but a line item cap is 3/day, the line item cap may be the binding constraint on some days. Review frequency caps at all levels to understand actual delivery behavior.
6. Averaging VCR or CPA across line items
VCR and CPA are ratios. Averaging them across line items gives mathematically incorrect results. Instead, sum the numerator and denominator separately, then calculate the ratio from the totals. For VCR: total completions ÷ total starts. For CPA: total cost ÷ total conversions.
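A small sketch with invented line-item numbers shows how far the naive average can drift from the pooled ratio:

```python
# Hypothetical line items: one small with high VCR, one large with low VCR.
line_items = [
    {"starts": 1_000, "completions": 900, "cost": 500.0, "conversions": 10},
    {"starts": 9_000, "completions": 2_700, "cost": 4_500.0, "conversions": 40},
]

# Wrong: the mean of per-line-item VCRs weights both items equally.
naive_vcr = sum(li["completions"] / li["starts"] for li in line_items) / len(line_items)

# Right: sum numerators and denominators, then take the ratio.
pooled_vcr = sum(li["completions"] for li in line_items) / sum(li["starts"] for li in line_items)
print(f"{naive_vcr:.0%} vs {pooled_vcr:.0%}")  # 60% vs 36%

# Same rule for CPA: total cost / total conversions.
pooled_cpa = sum(li["cost"] for li in line_items) / sum(li["conversions"] for li in line_items)
print(pooled_cpa)  # 100.0
```

The naive average nearly doubles the apparent VCR because the small, high-performing line item counts as much as the one serving nine times the volume.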
7. Not accounting for cross-exchange frequency
When your line items buy across multiple exchanges, the same user may be reached through different supply paths. DV360's frequency cap operates at the DV360 level (not per exchange), but if you also run campaigns on other DSPs (like The Trade Desk), there is no cross-platform frequency coordination. Users may receive significantly more impressions than intended. Use reach and frequency reports to monitor actual exposure levels across all exchanges.
What Changed in DV360 in 2024-2026
Google has made significant updates to DV360's reporting and capabilities over the past two years. Understanding these changes is critical for anyone working with historical data or building new reporting infrastructure.
Cookie deprecation and identity changes
Google's evolving approach to third-party cookie deprecation has affected how DV360 measures reach, frequency, and audience targeting. Cookie-based reach metrics are increasingly unreliable as browser restrictions expand. DV360 now emphasizes modeled reach metrics and Google's first-party identity signals. Advertisers should prioritize publisher-authenticated inventory and first-party data strategies for more accurate measurement.
YouTube integration enhancements
DV360's YouTube & Partners line item type has gained additional metrics including earned views, earned subscribers, and YouTube-specific brand lift metrics. YouTube Shorts inventory is now available through DV360 with corresponding format-specific metrics. The integration between DV360 and YouTube continues to deepen, making DV360 the preferred buying platform for programmatic YouTube campaigns.
Privacy-safe measurement
DV360 has expanded its privacy-safe measurement capabilities through Google's Privacy Sandbox APIs. Topics API-based targeting and Attribution Reporting API-based conversion measurement provide alternatives to cookie-based approaches. These privacy-preserving methods produce aggregated and potentially delayed metrics compared to cookie-based measurement — plan your reporting cadence accordingly.
Report Builder improvements
The DV360 Report Builder has been updated with improved filtering, additional cross-dimension capabilities, and faster report generation. New report types include Environmental Impact reports (carbon emissions estimates for ad delivery) and Programmatic Guaranteed performance reports. The UI has been redesigned for easier navigation and report template management.
Audience targeting changes
DV360's audience targeting options have evolved with the deprecation of certain third-party cookie-based audience segments. Google's Topics API provides contextual targeting alternatives, while first-party data activation through Customer Match and Publisher Provided Identifiers has become the recommended approach for audience targeting. New audience reporting dimensions show the percentage of impressions that used cookie-based versus privacy-preserving targeting signals, helping advertisers track their transition to cookieless approaches.
