Understanding the full landscape of Google Ads dimensions and metrics is essential for anyone running paid search, display, shopping, video, or Performance Max campaigns. Whether you're building custom dashboards, pulling data through the Google Ads API with GAQL queries, or analyzing reports in the Google Ads interface, knowing exactly what data is available — and what each field means — is the foundation of effective campaign optimization.
This guide provides a complete reference of every dimension and metric available in Google Ads as of 2026. We've organized them by level and category, included the API resource and field names for developers, and added practical context on when and how to use each one. We also cover recent changes that affect how you access and interpret your data.
What Are Google Ads Dimensions vs Metrics?
Before diving into the full reference, it's important to understand the difference between dimensions, metrics, and segments — three concepts that are fundamental to Google Ads reporting but serve different purposes.
Dimensions are descriptive attributes that define what you're looking at. They are the identifiers, settings, and properties that let you organize and filter your data. Examples include campaign name, ad group ID, keyword text, match type, and ad type. In the Google Ads API, dimensions come from resource fields (like campaign.name or ad_group.id). Dimensions answer the question: "How do I want to slice this data?"
Metrics are quantitative measurements that tell you how things performed. They are the numbers: impressions, clicks, cost, conversions, CTR, CPC. In the API, metrics live under the metrics object (like metrics.impressions or metrics.cost_micros). Metrics answer the question: "What happened with my ads?"
Segments are a special type of dimension that you can apply across any resource to break down performance. For example, you can segment your campaign metrics by device, network, day of week, or conversion action to identify which slices drive the best results. In GAQL, segments are specified in the SELECT clause alongside resource fields and metrics.
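In GAQL, all three concepts appear side by side in the SELECT clause. A minimal sketch (the field names are real GAQL identifiers; the query itself is illustrative and not tied to any account):

```python
# Dimensions (resource fields), a segment, and metrics are selected
# together in a single GAQL statement.
query = """
    SELECT
      campaign.name,
      segments.device,
      metrics.impressions,
      metrics.clicks
    FROM campaign
    WHERE segments.date DURING LAST_30_DAYS
"""
```

Running this through the API returns one row per campaign per device, with the metrics already broken down by the segment.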
How Is Google Ads Data Structured?
Google Ads data follows a strict hierarchy: Account > Campaign > Ad Group > Ad / Keyword. Each level inherits configuration from above and adds its own settings. Campaigns define the type (Search, Display, Shopping, Video, PMax), bidding strategy, budget, and network targeting. Ad groups organize keywords, audiences, and default bids. Ads contain the creative (headlines, descriptions, URLs), and keywords define the search queries you bid on.
In the Google Ads API, data is queried using Google Ads Query Language (GAQL) — a SQL-like syntax where you SELECT resource fields, segments, and metrics FROM a specific resource (like campaign, ad_group, or keyword_view). Unlike the Meta API where metrics are returned through a separate Insights endpoint, the Google Ads API returns dimensions and metrics together in a single query. The segments object provides cross-cutting breakdowns comparable to Facebook's breakdown dimensions.
One important difference from other platforms: Google Ads uses micros for all monetary values in the API. A value of 1,500,000 micros equals $1.50 in actual currency. This applies to cost, bids, budgets, and all cost-related metrics. Always divide by 1,000,000 when displaying or calculating with API monetary values.
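A pair of tiny helpers keeps that conversion in one place (a minimal sketch; the function names are our own, not part of the API):

```python
MICROS_PER_UNIT = 1_000_000

def micros_to_currency(micros: int) -> float:
    """Convert an API micros value (cost, bid, budget) to currency units."""
    return micros / MICROS_PER_UNIT

def currency_to_micros(amount: float) -> int:
    """Convert a currency amount back to micros for bid/budget mutations."""
    return round(amount * MICROS_PER_UNIT)

print(micros_to_currency(1_500_000))  # 1.5
```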
Campaign-Level Dimensions
Campaign-level dimensions define the top-level structure of your advertising. These fields identify the campaign and its core configuration — campaign type, bidding strategy, budget, network settings, and status. Use these dimensions to organize reporting by campaign and understand the strategic setup behind your results.
| Dimension | API Field | Description |
|---|---|---|
| Campaign ID | campaign.id | Unique identifier for the campaign |
| Campaign Name | campaign.name | The name of the campaign as set by the advertiser |
| Campaign Type | campaign.advertising_channel_type | Primary channel: SEARCH, DISPLAY, SHOPPING, VIDEO, PERFORMANCE_MAX, MULTI_CHANNEL, APP, SMART, HOTEL, LOCAL, DISCOVERY (legacy), DEMAND_GEN |
| Campaign Sub-Type | campaign.advertising_channel_sub_type | Specialization within the channel type: SEARCH_MOBILE_APP, DISPLAY_GMAIL, SHOPPING_COMPARISON, VIDEO_ACTION, etc. |
| Bidding Strategy Type | campaign.bidding_strategy_type | Bid strategy: MAXIMIZE_CONVERSIONS, MAXIMIZE_CONVERSION_VALUE, TARGET_CPA, TARGET_ROAS, TARGET_IMPRESSION_SHARE, MANUAL_CPC, MANUAL_CPM, MANUAL_CPV |
| Campaign Budget | campaign_budget.amount_micros | Daily or total campaign budget in micros (divide by 1,000,000 for actual currency value) |
| Budget Delivery Method | campaign_budget.delivery_method | How budget is spent: STANDARD (evenly throughout the day) or ACCELERATED (as fast as possible; deprecated for most campaign types) |
| Campaign Status | campaign.status | Campaign status: ENABLED, PAUSED, or REMOVED |
| Serving Status | campaign.serving_status | Computed delivery status: SERVING, NONE, ENDED, PENDING, SUSPENDED, or various LIMITED states |
| Start Date | campaign.start_date | Campaign start date in YYYY-MM-DD format |
| End Date | campaign.end_date | Campaign end date in YYYY-MM-DD format (if set) |
| Network Settings - Search | campaign.network_settings.target_google_search | Whether ads appear on Google Search results pages |
| Network Settings - Search Partners | campaign.network_settings.target_search_network | Whether ads appear on Google search partner sites |
| Network Settings - Display | campaign.network_settings.target_content_network | Whether ads appear on the Google Display Network |
| Target CPA | campaign.target_cpa.target_cpa_micros | Target cost per acquisition in micros for Target CPA bidding |
| Target ROAS | campaign.target_roas.target_roas | Target return on ad spend as a ratio (e.g., 3.5 = 350% ROAS) for Target ROAS bidding |
| Optimization Score | campaign.optimization_score | Score from 0% to 100% indicating how well the campaign is set up to perform |
| Labels | campaign.labels | Custom labels applied to the campaign for organization and filtering |
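The fields above can be fetched in a single GAQL query; the string below would be passed to `GoogleAdsService.SearchStream` in the official client library (the field names are real; the status filter is just one common choice):

```python
# Campaign-level dimension pull; campaign_budget is an attributed
# resource and can be selected from the campaign resource.
campaign_query = """
    SELECT
      campaign.id,
      campaign.name,
      campaign.advertising_channel_type,
      campaign.bidding_strategy_type,
      campaign.status,
      campaign_budget.amount_micros
    FROM campaign
    WHERE campaign.status != 'REMOVED'
"""
```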
Ad Group-Level Dimensions
Ad group dimensions define the targeting and bidding context within a campaign. Ad groups contain your keywords (for Search), audience signals, and ads. These fields control how your ads are organized, what default bids apply, and the structural relationship between keywords and creative.
| Dimension | API Field | Description |
|---|---|---|
| Ad Group ID | ad_group.id | Unique identifier for the ad group |
| Ad Group Name | ad_group.name | Name of the ad group as defined by the advertiser |
| Ad Group Type | ad_group.type | Type of ad group: SEARCH_STANDARD, SEARCH_DYNAMIC_ADS, DISPLAY_STANDARD, SHOPPING_PRODUCT_ADS, SHOPPING_SHOWCASE_ADS, HOTEL_ADS, VIDEO_BUMPER, VIDEO_TRUE_VIEW_IN_STREAM, etc. |
| Ad Group Status | ad_group.status | Status: ENABLED, PAUSED, or REMOVED |
| CPC Bid | ad_group.cpc_bid_micros | Maximum cost-per-click bid in micros for the ad group |
| CPM Bid | ad_group.cpm_bid_micros | Maximum cost-per-thousand-impressions bid for Display campaigns |
| CPV Bid | ad_group.cpv_bid_micros | Maximum cost-per-view bid in micros for Video campaigns |
| Target CPA Override | ad_group.target_cpa_micros | Ad group-level target CPA override (when different from campaign target) |
| Target ROAS Override | ad_group.target_roas | Ad group-level target ROAS override (when different from campaign target) |
| Final URL Suffix | ad_group.final_url_suffix | URL suffix appended to final URLs at the ad group level for tracking |
| Ad Rotation | ad_group.ad_rotation_mode | How ads are rotated: OPTIMIZE (favor best-performing) or ROTATE_INDEFINITELY (equal rotation) |
| Labels | ad_group.labels | Custom labels applied to the ad group for organization and filtering |
Keyword Dimensions
Keyword dimensions describe your search targeting criteria — the words and phrases you bid on, their match types, and their quality diagnostics. These are the most granular targeting dimensions in Search campaigns and directly impact when your ads appear, how much you pay, and where you rank. Quality Score components at the keyword level are among the most important diagnostics in the platform.
| Dimension | API Field | Description |
|---|---|---|
| Keyword Text | ad_group_criterion.keyword.text | The actual keyword text you are bidding on |
| Match Type | ad_group_criterion.keyword.match_type | How the keyword is matched: BROAD, PHRASE, or EXACT |
| Keyword Status | ad_group_criterion.status | Status: ENABLED, PAUSED, or REMOVED |
| Approval Status | ad_group_criterion.approval_status | Policy review status: APPROVED, APPROVED_LIMITED, DISAPPROVED, UNDER_REVIEW |
| Quality Score | ad_group_criterion.quality_info.quality_score | Overall Quality Score from 1 to 10 based on expected CTR, ad relevance, and landing page experience |
| Expected CTR | ad_group_criterion.quality_info.search_predicted_ctr | Expected click-through rate compared to other advertisers: BELOW_AVERAGE, AVERAGE, ABOVE_AVERAGE |
| Ad Relevance | ad_group_criterion.quality_info.creative_quality_score | How closely your ad matches the keyword's intent: BELOW_AVERAGE, AVERAGE, ABOVE_AVERAGE |
| Landing Page Experience | ad_group_criterion.quality_info.post_click_quality_score | Quality and relevance of your landing page: BELOW_AVERAGE, AVERAGE, ABOVE_AVERAGE |
| Search Impression Share | metrics.search_impression_share | Percentage of eligible impressions your keyword received on the Search Network |
| Top Impression % | metrics.search_top_impression_percentage | Percentage of your impressions shown above organic results |
| Absolute Top Impression % | metrics.search_absolute_top_impression_percentage | Percentage of your impressions shown as the very first ad above organic results |
| First Page Bid Estimate | ad_group_criterion.position_estimates.first_page_cpc_micros | Estimated CPC bid needed to show on the first page of search results |
| Top of Page Bid Estimate | ad_group_criterion.position_estimates.top_of_page_cpc_micros | Estimated CPC bid needed to show above organic results |
| First Position Bid Estimate | ad_group_criterion.position_estimates.first_position_cpc_micros | Estimated CPC bid needed to show in the first ad position |
| Final URL | ad_group_criterion.final_urls | Landing page URL(s) set at the keyword level (overrides ad-level URL if set) |
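Once the quality fields are pulled, a simple triage routine can map each BELOW_AVERAGE component to its likely fix. A sketch over illustrative rows (the dict keys mirror the API field names; the thresholds and messages are our own assumptions):

```python
def diagnose(row: dict) -> str:
    """Map Quality Score components to the most likely remediation."""
    if row["quality_score"] >= 7:
        return "healthy"
    if row["search_predicted_ctr"] == "BELOW_AVERAGE":   # Expected CTR
        return "improve ad copy / CTR"
    if row["creative_quality_score"] == "BELOW_AVERAGE":  # Ad relevance
        return "tighten keyword-to-ad relevance"
    if row["post_click_quality_score"] == "BELOW_AVERAGE":  # Landing page
        return "improve landing page"
    return "monitor"

print(diagnose({"quality_score": 4,
                "search_predicted_ctr": "AVERAGE",
                "creative_quality_score": "BELOW_AVERAGE",
                "post_click_quality_score": "AVERAGE"}))
# tighten keyword-to-ad relevance
```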
Ad & Creative Dimensions
Ad-level dimensions describe the individual ad units — their format, content, quality indicators, and status. Google Ads supports multiple ad types including Responsive Search Ads, Responsive Display Ads, image ads, video ads, Shopping product ads, and app ads. Understanding these dimensions is critical for creative analysis and optimization.
| Dimension | API Field | Description |
|---|---|---|
| Ad ID | ad_group_ad.ad.id | Unique identifier for the ad |
| Ad Type | ad_group_ad.ad.type | Format: RESPONSIVE_SEARCH_AD, EXPANDED_TEXT_AD (legacy), RESPONSIVE_DISPLAY_AD, IMAGE_AD, VIDEO_AD, SHOPPING_PRODUCT_AD, CALL_AD, APP_AD, DISCOVERY_MULTI_ASSET_AD, DEMAND_GEN_MULTI_ASSET_AD, etc. |
| Ad Status | ad_group_ad.status | Status: ENABLED, PAUSED, or REMOVED |
| Policy Review Status | ad_group_ad.policy_summary.approval_status | Approval status: APPROVED, APPROVED_LIMITED, AREA_OF_INTEREST_ONLY, DISAPPROVED, UNKNOWN |
| Headlines | ad_group_ad.ad.responsive_search_ad.headlines | Up to 15 headline assets (max 30 characters each) for Responsive Search Ads |
| Descriptions | ad_group_ad.ad.responsive_search_ad.descriptions | Up to 4 description assets (max 90 characters each) for Responsive Search Ads |
| Final URL | ad_group_ad.ad.final_urls | Landing page URL(s) the ad directs users to |
| Final Mobile URL | ad_group_ad.ad.final_mobile_urls | Mobile-specific landing page URL(s) if different from desktop |
| Display URL | ad_group_ad.ad.display_url | The URL shown in the ad (may differ from actual landing page) |
| Path 1 | ad_group_ad.ad.responsive_search_ad.path1 | First path field appended to display URL (max 15 characters) |
| Path 2 | ad_group_ad.ad.responsive_search_ad.path2 | Second path field appended to display URL (max 15 characters) |
| Ad Strength | ad_group_ad.ad_strength | Creative quality indicator: EXCELLENT, GOOD, AVERAGE, POOR, UNSPECIFIED — based on headline/description diversity and relevance |
| Tracking URL Template | ad_group_ad.ad.tracking_url_template | URL template for click tracking with ValueTrack parameters |
| Custom Parameters | ad_group_ad.ad.url_custom_parameters | Custom tracking parameters for use in URL templates |
Core Performance Metrics
These are the fundamental metrics that measure how your ads are delivered and interacted with. Every Google Ads advertiser should understand these — they form the basis of all campaign analysis and optimization decisions. All monetary values in the API are returned in micros (divide by 1,000,000 for the actual currency value).
| Metric | API Field | Description | Formula / Notes |
|---|---|---|---|
| Impressions | metrics.impressions | Number of times your ads were shown | Counts each time the ad appears on a search result page or a Display/Video placement |
| Clicks | metrics.clicks | Number of clicks on your ads | Includes clicks to your website, click-to-call, app downloads, and map directions |
| CTR | metrics.ctr | Click-through rate | (Clicks ÷ Impressions) × 100 |
| Average CPC | metrics.average_cpc | Average cost per click in micros | Cost ÷ Clicks. Your actual CPC is often lower than your max CPC bid due to Ad Rank |
| Average CPM | metrics.average_cpm | Average cost per 1,000 impressions in micros | (Cost ÷ Impressions) × 1,000. Primarily used for Display and Video campaigns |
| Cost | metrics.cost_micros | Total cost (spend) in micros | Sum of CPC charges, CPM charges, and CPV charges in the period |
| Interactions | metrics.interactions | Number of interactions (clicks for Search, engagements for Video, etc.) | The primary interaction type varies by campaign type |
| Interaction Rate | metrics.interaction_rate | How often people interact after seeing your ad | Interactions ÷ Impressions |
| Search Impression Share | metrics.search_impression_share | Percentage of eligible Search impressions you received | Impressions ÷ Total Eligible Impressions. Values below 10% may not be shown |
| Search Top Impression % | metrics.search_top_impression_percentage | Percentage of impressions shown above organic results | Top Impressions ÷ Total Impressions |
| Search Absolute Top Impression % | metrics.search_absolute_top_impression_percentage | Percentage of impressions in the very first ad position | Absolute Top Impressions ÷ Total Impressions |
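The rate metrics are all derivable from the raw counts, which is handy for validating API pulls against the UI. A minimal sketch (remember to convert micros first):

```python
def derived_metrics(impressions: int, clicks: int, cost_micros: int) -> dict:
    """Recompute CTR, average CPC, and average CPM from raw counts."""
    cost = cost_micros / 1_000_000  # micros -> currency units
    return {
        "ctr": clicks / impressions if impressions else 0.0,
        "avg_cpc": cost / clicks if clicks else 0.0,
        "avg_cpm": cost * 1000 / impressions if impressions else 0.0,
    }

m = derived_metrics(impressions=20_000, clicks=500, cost_micros=750_000_000)
# m["ctr"] == 0.025, m["avg_cpc"] == 1.5, m["avg_cpm"] == 37.5
```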
Conversion Metrics
Conversion metrics measure the business outcomes driven by your ads — purchases, leads, sign-ups, phone calls, and other valuable actions. Google Ads distinguishes between Conversions (primary conversion actions included in Smart Bidding optimization) and All Conversions (which add secondary actions, cross-device, view-through, and modeled conversions). Understanding this distinction is critical for accurate reporting.
| Metric | API Field | Description | Formula / Notes |
|---|---|---|---|
| Conversions | metrics.conversions | Number of primary conversion actions | Only includes actions marked as "primary" in conversion settings. Uses configured counting (one or every) |
| All Conversions | metrics.all_conversions | Total of all conversion actions including secondary and modeled | Conversions + secondary actions + cross-device + view-through + modeled conversions |
| Conversion Rate | metrics.conversions_from_interactions_rate | How often an interaction leads to a conversion | Conversions ÷ Interactions |
| Cost Per Conversion | metrics.cost_per_conversion | Average cost for each primary conversion | Cost ÷ Conversions |
| Cost Per All Conversions | metrics.cost_per_all_conversions | Average cost for each conversion (all types) | Cost ÷ All Conversions |
| Conversion Value | metrics.conversions_value | Total value of primary conversions | Sum of conversion values as reported by your tracking tags |
| All Conversion Value | metrics.all_conversions_value | Total value of all conversions | Includes secondary, cross-device, view-through, and modeled conversion values |
| Conversion Value / Cost (ROAS) | metrics.conversions_value_per_cost | Return on ad spend from primary conversions | Conversion Value ÷ Cost. A ROAS of 5.0 means $5 revenue per $1 spent |
| Value Per Conversion | metrics.value_per_conversion | Average value of each primary conversion | Conversion Value ÷ Conversions |
| View-Through Conversions | metrics.view_through_conversions | Conversions from people who saw (but did not click) your ad | Counted within the view-through conversion window (default 1 day) |
| Cross-Device Conversions | metrics.cross_device_conversions | Conversions where the click and conversion happened on different devices | Included in All Conversions but separately measurable for cross-device attribution analysis |
| Conversions by Conversion Date | metrics.conversions_by_conversion_date | Conversions attributed to the date the conversion occurred (not the click date) | Useful for comparing with back-end data that uses transaction dates |
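The Conversions vs. All Conversions distinction matters in code too: Smart Bidding optimizes against the former, while back-end revenue reconciliation usually compares against the latter. A hedged sketch of the derived figures (function and key names are our own):

```python
def conversion_summary(cost_micros, conversions, conversions_value,
                       all_conversions):
    """Derive CPA, ROAS, and the share of non-primary conversions."""
    cost = cost_micros / 1_000_000
    return {
        "cpa": cost / conversions if conversions else None,
        "roas": conversions_value / cost if cost else None,
        # Share of All Conversions that are secondary/cross-device/modeled
        "secondary_share": round(1 - conversions / all_conversions, 4)
                           if all_conversions else None,
    }

s = conversion_summary(cost_micros=2_000_000_000, conversions=100,
                       conversions_value=10_000, all_conversions=125)
# cpa = 20.0, roas = 5.0, secondary_share = 0.2
```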
Shopping & Product Metrics
Shopping metrics apply to Google Shopping campaigns, Performance Max with product feeds, and free product listings. These dimensions and metrics combine your Merchant Center product data with advertising performance, letting you analyze results at the product level. Essential for e-commerce advertisers optimizing their product feed and bidding.
| Dimension / Metric | API Field | Description |
|---|---|---|
| Product Title | segments.product_title | Product title from your Merchant Center feed |
| Product Type (L1-L5) | segments.product_type_l1 through l5 | Product type hierarchy from your feed (up to 5 levels of categorization) |
| Brand | segments.product_brand | Product brand attribute from your Merchant Center feed |
| Custom Label (0-4) | segments.product_custom_attribute0 through 4 | Custom label values from your feed for grouping products (margin tiers, seasonal tags, priority levels, etc.) |
| Merchant Center ID | segments.product_merchant_id | Google Merchant Center account ID associated with the product |
| Item ID | segments.product_item_id | Unique item ID (SKU) from your product feed |
| Product Condition | segments.product_condition | Condition of the product: NEW, REFURBISHED, or USED |
| Product Channel | segments.product_channel | Sales channel: ONLINE or LOCAL |
| Product Channel Exclusivity | segments.product_channel_exclusivity | Whether the product is sold only online, only locally, or both: SINGLE_CHANNEL or MULTI_CHANNEL |
| Clicks (Product) | metrics.clicks | Clicks on your Shopping product ads |
| Impressions (Product) | metrics.impressions | Times your Shopping product ad was shown |
| CTR (Product) | metrics.ctr | Click-through rate for product-level Shopping ads |
| Conversion Rate (Product) | metrics.conversions_from_interactions_rate | Conversion rate for product-level interactions |
| Search Lost IS (Budget) | metrics.search_budget_lost_impression_share | Percentage of Shopping impressions lost due to insufficient budget |
| Product Category (L1-L5) | segments.product_category_level1 through level5 | Google product category from your feed (Google's taxonomy) |
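Custom labels are where product-level rows become actionable: aggregate by label to compare margin tiers or seasonal groups across your feed. A sketch over illustrative rows (dict keys mirror the segment and metric field names; the data is made up):

```python
from collections import defaultdict

def by_custom_label(rows):
    """Aggregate product-level rows by custom label 0 (e.g. margin tier)."""
    agg = defaultdict(lambda: {"clicks": 0, "cost": 0.0})
    for r in rows:
        bucket = agg[r["product_custom_attribute0"] or "(unlabeled)"]
        bucket["clicks"] += r["clicks"]
        bucket["cost"] += r["cost_micros"] / 1_000_000
    return dict(agg)

rows = [
    {"product_custom_attribute0": "high-margin", "clicks": 40, "cost_micros": 30_000_000},
    {"product_custom_attribute0": "high-margin", "clicks": 10, "cost_micros": 5_000_000},
    {"product_custom_attribute0": None, "clicks": 3, "cost_micros": 1_000_000},
]
# by_custom_label(rows)["high-margin"]["clicks"] == 50
```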
Video & YouTube Metrics
Video metrics measure how users engage with your video ad content on YouTube and across the Google Video Partners network. From view counts and completion rates to earned actions and subscriber growth, these metrics help you understand which videos capture attention and drive downstream engagement. Critical for YouTube advertising and Demand Gen campaigns with video assets.
| Metric | API Field | Description | Formula / Notes |
|---|---|---|---|
| Video Views | metrics.video_views | Number of times your video ad was watched | Counted when someone watches 30 seconds (or the full ad if shorter) or interacts with the ad |
| View Rate | metrics.video_view_rate | Percentage of impressions that resulted in a view | Video Views ÷ Impressions |
| Average CPV | metrics.average_cpv | Average cost per video view in micros | Cost ÷ Video Views |
| Earned Views | metrics.video_views_earned | Organic views of your video or other videos on your channel after the initial ad view | Measures downstream organic engagement earned by the paid ad |
| Earned Subscribers | metrics.earned_subscribes | YouTube channel subscriptions earned from your video ad | Free subscriptions earned after someone views your ad |
| Video Played to 25% | metrics.video_quartile_p25_rate | Percentage of impressions where the video was watched to 25% | First quartile retention — measures initial hook effectiveness |
| Video Played to 50% | metrics.video_quartile_p50_rate | Percentage of impressions where the video was watched to 50% | Midpoint retention — indicates sustained interest in your message |
| Video Played to 75% | metrics.video_quartile_p75_rate | Percentage of impressions where the video was watched to 75% | Third quartile — viewers who reach this point are highly engaged |
| Video Played to 100% | metrics.video_quartile_p100_rate | Percentage of impressions where the entire video was watched | Full completion rate — the strongest video engagement signal |
| Engagement Rate | metrics.engagement_rate | How often people engage with your video ad after it is shown | Engagements ÷ Impressions. Engagements include clicks on interactive elements and CTAs |
| Engagements | metrics.engagements | Number of engagements with your video ad | Clicks on interactive elements, CTA overlays, cards, and companion banners |
| Earned Likes | metrics.video_likes_earned | Likes earned on your video or channel after the paid ad view | Organic likes driven by your video ad campaign |
| Earned Shares | metrics.video_shares_earned | Shares of your video earned after the paid ad view | Organic shares driven by your video ad campaign |
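The four quartile rates become more useful as per-quartile drop-offs: the biggest drop shows where viewers abandon the video. A small sketch (function name is our own):

```python
def quartile_dropoff(p25, p50, p75, p100):
    """Per-quartile viewer loss from the four quartile completion rates."""
    points = [1.0, p25, p50, p75, p100]
    return [round(points[i] - points[i + 1], 4) for i in range(4)]

print(quartile_dropoff(0.60, 0.45, 0.38, 0.30))
# [0.4, 0.15, 0.07, 0.08] -- most viewers drop before the 25% mark
```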
Quality & Auction Metrics
Quality and auction metrics measure how your ads, keywords, and landing pages compare to competitors in the same auctions. Quality Score is the most important diagnostic in Google Ads — it directly impacts your Ad Rank, ad position, actual CPC, and whether your ad shows at all. Auction insights let you benchmark against specific competitors.
| Metric | API Field | Description | Values / Notes |
|---|---|---|---|
| Quality Score | ad_group_criterion.quality_info.quality_score | Overall quality diagnostic from 1 to 10 | Based on expected CTR, ad relevance, and landing page experience. Only populated for keywords with enough data |
| Expected CTR | ad_group_criterion.quality_info.search_predicted_ctr | Likelihood of your ad being clicked compared to other ads | BELOW_AVERAGE, AVERAGE, or ABOVE_AVERAGE |
| Ad Relevance | ad_group_criterion.quality_info.creative_quality_score | How well your ad matches the searcher's intent | BELOW_AVERAGE, AVERAGE, or ABOVE_AVERAGE |
| Landing Page Experience | ad_group_criterion.quality_info.post_click_quality_score | Relevance and usability of your landing page | BELOW_AVERAGE, AVERAGE, or ABOVE_AVERAGE. Factors: relevance, page speed, mobile-friendliness, navigation |
| Search Impression Share | metrics.search_impression_share | Impressions received vs. estimated total eligible impressions | Expressed as a percentage. Low values indicate missed opportunity from budget or rank |
| Search Lost IS (Budget) | metrics.search_budget_lost_impression_share | Estimated % of impressions lost because your budget was too low | High values mean increasing budget would significantly increase impressions |
| Search Lost IS (Rank) | metrics.search_rank_lost_impression_share | Estimated % of impressions lost because of poor Ad Rank | Improve with higher bids, better Quality Score, or more relevant ads |
| Search Top IS | metrics.search_top_impression_share | Impressions you received in top position vs. estimated eligible top impressions | How often you appear above organic when you're eligible to |
| Search Absolute Top IS | metrics.search_absolute_top_impression_share | Impressions you received in the very first position vs. estimated eligible | How often you appear in the #1 ad slot when eligible |
| Auction Insight — Impression Share | auction_insight.search_impression_share | Competitor's impression share in auctions you both participated in | Available through the Auction Insights report — shows competitive landscape |
| Auction Insight — Overlap Rate | auction_insight.overlap_rate | How often a competitor's ad also appeared when your ad was shown | High overlap = direct competitors for same queries |
| Auction Insight — Outranking Share | auction_insight.outranking_share | How often your ad ranked higher than a competitor's (or appeared when theirs did not) | Combined metric of position above rate + non-overlap |
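The two lost-IS metrics split missed impressions into two distinct fixes, which lends itself to a simple rule. A sketch (the 90% saturation threshold and the messages are our own assumptions):

```python
def is_recommendation(impression_share, lost_is_budget, lost_is_rank):
    """All three arguments are percentages that sum to roughly 100."""
    if impression_share >= 90:
        return "near saturation"
    if lost_is_budget >= lost_is_rank:
        return "raise budget"
    return "improve Ad Rank (bids, Quality Score, relevance)"

print(is_recommendation(62.0, 25.0, 13.0))  # raise budget
```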
Audience & Targeting Breakdowns
Audience and targeting breakdowns are segments that let you split your metrics by who saw your ads, where they saw them, and when. These are essential for understanding which audiences, devices, locations, and times drive the best performance — and for making informed bid adjustments and targeting decisions.
Device Breakdown
| Segment | API Field | Values |
|---|---|---|
| Device | segments.device | MOBILE, DESKTOP, TABLET, CONNECTED_TV, OTHER |
Network Breakdown
| Segment | API Field | Values |
|---|---|---|
| Ad Network Type | segments.ad_network_type | SEARCH, SEARCH_PARTNERS, CONTENT (Display), YOUTUBE, GOOGLE_TV, MIXED, UNKNOWN (the legacy YOUTUBE_SEARCH and YOUTUBE_WATCH values were consolidated into YOUTUBE) |
Location Breakdowns
| Segment | API Field | Description |
|---|---|---|
| Geographic Target (Country) | geographic_view.country_criterion_id | Country where the ad was shown, using Google's geo criterion IDs |
| Geo Location Type | geographic_view.location_type | Whether the row reflects the user's physical location (LOCATION_OF_PRESENCE) or a location of interest (AREA_OF_INTEREST) |
| User Location (Country) | user_location_view.country_criterion_id | Country where the user was physically located |
| Targeted Location | user_location_view.targeting_location | Whether the location was explicitly targeted by the campaign; users can trigger ads from a location of interest that differs from their physical location |
Time Breakdowns
| Segment | API Field | Description |
|---|---|---|
| Hour of Day | segments.hour | Hour (0-23) when the impression occurred in the account's timezone |
| Day of Week | segments.day_of_week | MONDAY through SUNDAY — day when the ad was served |
| Date | segments.date | Specific date in YYYY-MM-DD format |
| Week | segments.week | Week starting Monday, represented by the Monday's date in YYYY-MM-DD format |
| Month | segments.month | Month in YYYY-MM-01 format |
| Quarter | segments.quarter | Quarter in YYYY-MM-DD format (first day of the quarter) |
Demographic Breakdowns
| Segment / Resource | API Field | Description |
|---|---|---|
| Age Range | ad_group_criterion.age_range.type | AGE_RANGE_18_24, AGE_RANGE_25_34, AGE_RANGE_35_44, AGE_RANGE_45_54, AGE_RANGE_55_64, AGE_RANGE_65_UP, AGE_RANGE_UNDETERMINED |
| Gender | ad_group_criterion.gender.type | MALE, FEMALE, UNDETERMINED |
| Household Income | ad_group_criterion.income_range.type | INCOME_RANGE_90_UP (top 10%), INCOME_RANGE_80_90, INCOME_RANGE_70_80, INCOME_RANGE_60_70, INCOME_RANGE_50_60, INCOME_RANGE_0_50 (lower 50%), INCOME_RANGE_UNDETERMINED |
| Parental Status | ad_group_criterion.parental_status.type | PARENT, NOT_A_PARENT, UNDETERMINED |
Audience Segment Breakdowns
| Segment | API Field | Description |
|---|---|---|
| Audience Segment | ad_group_audience_view | Performance data segmented by audience list (remarketing, in-market, affinity, custom, Customer Match) |
| Audience Name | ad_group_criterion.user_list.user_list | Name or ID of the audience list being used for targeting or observation |
Placement Breakdowns
| Segment | API Field | Description |
|---|---|---|
| Placement (Automatic) | group_placement_view.placement | Website, app, or YouTube channel where your Display/Video ad appeared (automatic placements) |
| Placement Type | group_placement_view.placement_type | WEBSITE, MOBILE_APPLICATION, YOUTUBE_VIDEO, YOUTUBE_CHANNEL |
| Placement (Managed) | ad_group_criterion.placement.url | Manually targeted website or YouTube placement URL |
Conversion Action Breakdowns
| Segment | API Field | Description |
|---|---|---|
| Conversion Action | segments.conversion_action | Specific conversion action name (e.g., "Purchase", "Lead Form Submit", "Phone Call") |
| Conversion Action Category | segments.conversion_action_category | Category: PURCHASE, LEAD, PAGE_VIEW, SIGNUP, DOWNLOAD, ADD_TO_CART, BEGIN_CHECKOUT, SUBSCRIBE_PAID, PHONE_CALL_LEAD, IMPORTED_LEAD, etc. |
| Conversion Attribution Event Type | segments.conversion_attribution_event_type | Whether the conversion is attributed to an IMPRESSION or an INTERACTION (click); the attribution model itself is configured per conversion action via conversion_action.attribution_model_settings.attribution_model |
| Conversion Lag (Days) | segments.conversion_lag_bucket | Days between the ad interaction and the conversion: LESS_THAN_ONE_DAY, ONE_TO_TWO_DAYS, TWO_TO_THREE_DAYS, etc. up to SIXTY_PLUS_DAYS |
| External Conversion Source | segments.external_conversion_source | Source of the conversion: GOOGLE_PLAY, WEBSITE, FIREBASE, UPLOAD_CALLS, UPLOAD, SALESFORCE, THIRD_PARTY_APP_ANALYTICS, etc. |
| Click Type | segments.click_type | Type of click: URL_CLICKS, CALLS, SITELINKS, GET_DIRECTIONS, APP_DEEPLINK, BREADCRUMBS, etc. |
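Conversion segments come with a GAQL restriction worth knowing: once you select a conversion-specific segment such as segments.conversion_action_name, it can generally only be combined with conversion metrics, not with metrics like clicks or impressions. An illustrative query that respects this:

```python
# Break conversions down by conversion action; note that only
# conversion metrics accompany the conversion segments.
conversion_breakdown_query = """
    SELECT
      campaign.name,
      segments.conversion_action_name,
      segments.conversion_action_category,
      metrics.conversions,
      metrics.conversions_value
    FROM campaign
    WHERE segments.date DURING LAST_30_DAYS
"""
```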
Search Terms Dimensions
| Dimension | API Field | Description |
|---|---|---|
| Search Term Text | search_term_view.search_term | Actual search query the user typed that triggered your ad |
| Search Term Status | search_term_view.status | Whether the search term is ADDED (as a keyword), EXCLUDED (negative), or NONE |
| Search Term Match Type | segments.search_term_match_type | How the search term matched your keyword: BROAD, PHRASE, EXACT, or NEAR_EXACT/NEAR_PHRASE |
Performance Max Dimensions
Performance Max campaigns use a unique structure with asset groups, listing groups, and AI-driven creative combinations. While PMax provides less granular reporting than standard campaigns, Google has progressively expanded the available dimensions and metrics. These fields are essential for understanding how PMax allocates budget and which creative assets drive performance.
Asset Group Dimensions
| Dimension | API Field | Description |
|---|---|---|
| Asset Group ID | asset_group.id | Unique identifier for the asset group within a PMax campaign |
| Asset Group Name | asset_group.name | Name of the asset group as defined by the advertiser |
| Asset Group Status | asset_group.status | Status: ENABLED, PAUSED, or REMOVED |
| Asset Group Strength | asset_group.ad_strength | Quality indicator: EXCELLENT, GOOD, AVERAGE, POOR — based on asset variety and relevance |
Asset Performance Labels
| Dimension | API Field | Description |
|---|---|---|
| Asset Performance Label | asset_group_asset.performance_label | Individual asset rating: BEST, GOOD, LOW, or LEARNING. Indicates how well each headline, description, image, or video performs relative to others in the group |
| Asset Type | asset_group_asset.field_type | Role of the asset: HEADLINE, DESCRIPTION, LONG_HEADLINE, MARKETING_IMAGE, SQUARE_MARKETING_IMAGE, PORTRAIT_MARKETING_IMAGE, YOUTUBE_VIDEO, LOGO, LANDSCAPE_LOGO, BUSINESS_NAME, CALL_TO_ACTION_SELECTION |
| Asset Text Content | asset.text_asset.text | Actual text content of headline and description assets |
| Asset Image URL | asset.image_asset.full_size.url | URL of image assets used in the asset group |
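Asset-level PMax reporting comes from the asset_group_asset resource; the attributed asset and asset_group resources can be selected in the same query. An illustrative GAQL string (the status filter is one common choice):

```python
pmax_asset_query = """
    SELECT
      asset_group.name,
      asset_group_asset.field_type,
      asset_group_asset.performance_label,
      asset.text_asset.text
    FROM asset_group_asset
    WHERE asset_group_asset.status != 'REMOVED'
"""
```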
Listing Group Dimensions
| Dimension | API Field | Description |
|---|---|---|
| Listing Group Filter Type | asset_group_listing_group_filter.type | UNIT_INCLUDED, UNIT_EXCLUDED, or SUBDIVISION — defines how products are grouped in PMax Shopping |
| Listing Group Dimension | asset_group_listing_group_filter.case_value | Product attribute used for grouping: brand, category, condition, custom attribute, item ID, product type |
| Search Theme | campaign_search_term_insight.category_label | Grouped search themes showing what queries triggered your PMax ads (introduced 2025) |
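Tying these dimensions together, here is a sketch of a GAQL query that pulls each PMax asset alongside its performance label. Only the query string and a small post-processing helper are shown; executing the query requires an authenticated Google Ads API client (assumed, not shown), and the row shape in `low_assets` is illustrative.

```python
# Hedged sketch: GAQL for PMax asset performance labels, using the
# asset_group_asset fields documented in the tables above.
PMAX_ASSET_QUERY = """
    SELECT
      asset_group.name,
      asset_group_asset.field_type,
      asset_group_asset.performance_label,
      asset.text_asset.text
    FROM asset_group_asset
    WHERE asset_group_asset.performance_label = 'LOW'
"""

def low_assets(rows):
    """Filter already-fetched rows (dicts, illustrative shape) to
    LOW-labeled assets that are candidates for replacement."""
    return [r for r in rows if r.get("performance_label") == "LOW"]
```

Filtering on `performance_label = 'LOW'` in the WHERE clause keeps the result set small; the Python helper is only needed if you fetch all labels and filter client-side.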
Extension & Asset Metrics
Extensions (now called "assets" in the Google Ads interface) add supplemental information to your ads — additional links, phone numbers, locations, callouts, and structured snippets. Tracking extension performance helps you understand which assets enhance your ads and drive incremental clicks.
| Extension Type | API Resource | Key Metrics Available |
|---|---|---|
| Sitelink Extensions | campaign_asset / ad_group_asset | Clicks, impressions, CTR, conversions — measured per individual sitelink |
| Call Extensions | campaign_asset | Phone calls, call duration, phone-through rate, call conversions |
| Location Extensions | campaign_asset | Clicks for directions, phone calls, location detail expansions |
| Callout Extensions | campaign_asset | Impressions and clicks when callout text was displayed with the ad |
| Structured Snippets | campaign_asset | Impressions and clicks when structured snippet headers and values were shown |
| Price Extensions | campaign_asset | Clicks on individual price items, impressions, CTR |
| Promotion Extensions | campaign_asset | Clicks, impressions — tracks performance of promotional offers shown with ads |
| Lead Form Extensions | campaign_asset | Form opens, form submissions (leads), cost per lead |
| Image Extensions | campaign_asset | Impressions and clicks when images were shown alongside Search ads |
Display Network-Specific Metrics
Display campaigns use a different set of delivery and engagement metrics compared to Search. These metrics measure viewability, engagement with rich media, and audience reach across millions of websites and apps in the Google Display Network.
| Metric | API Field | Description |
|---|---|---|
| Active View Viewability | metrics.active_view_viewability | Percentage of measurable impressions that were viewable (50%+ of the ad visible for 1+ second) |
| Active View Measurability | metrics.active_view_measurability | Percentage of served impressions where viewability could be measured |
| Active View CTR | metrics.active_view_ctr | CTR calculated only on viewable impressions — more accurate than standard CTR for Display |
| Active View CPM | metrics.active_view_cpm | Cost per 1,000 viewable impressions — the true cost of being seen on Display |
| Gmail Saves | metrics.gmail_saves | Times someone saved your Gmail ad to their inbox |
| Gmail Forwards | metrics.gmail_forwards | Times someone forwarded your Gmail ad to another person |
| Gmail Clicks to Website | metrics.gmail_secondary_clicks | Clicks to your website from within the expanded Gmail ad |
| Content Impression Share | metrics.content_impression_share | Percentage of eligible Display Network impressions you received |
| Content Lost IS (Budget) | metrics.content_budget_lost_impression_share | Percentage of Display impressions lost due to insufficient budget |
| Content Lost IS (Rank) | metrics.content_rank_lost_impression_share | Percentage of Display impressions lost due to poor Ad Rank |
How to Use Google Ads Metrics for Optimization
Having access to 400+ metrics is powerful, but knowing which ones matter for your specific campaign types and goals is what separates effective advertisers from data-overwhelmed ones. Here's a practical framework for selecting the right metrics at each level.
For Search campaigns
Focus on Quality Score and its three components (expected CTR, ad relevance, landing page experience) to diagnose efficiency issues. Track search impression share and search lost IS (budget vs. rank) to understand whether you're missing opportunities due to budget constraints or poor Ad Rank. Monitor top impression % and absolute top impression % for brand campaigns where position matters. Use cost per conversion and ROAS as your primary optimization KPIs.
For Shopping campaigns
Analyze at the product level using product title, brand, and custom labels to identify winners and losers. Track conversion rate and ROAS by product to prioritize bid increases for high-performers and reduce bids on underperformers. Use impression share data to identify products losing visibility to competitors. Monitor benchmark CPC to understand competitive pricing dynamics.
For Video / YouTube campaigns
Use view rate and quartile completion rates (25%, 50%, 75%, 100%) to understand where viewers drop off. A sharp drop between 25% and 50% means your message isn't sustaining interest. Track earned views and earned subscribers to measure organic amplification from paid ads. For action campaigns, conversions and cost per conversion matter more than view rate.
For Performance Max campaigns
Focus on asset group-level conversions and conversion value since PMax offers limited granularity. Use asset performance labels (Best, Good, Low, Learning) to replace underperforming assets. Monitor search themes to understand which queries trigger your PMax ads. Track listing group performance for product-level optimization. Compare PMax performance against standard campaigns by ensuring conversion tracking is consistent across both.
For cross-campaign optimization
Compare cost per conversion and ROAS across campaign types to allocate budget effectively. Use device, location, and time of day segments to identify patterns that apply across campaigns — for example, if mobile converts better on weekends across all campaigns, apply bid adjustments accordingly. Use audience segment breakdowns to understand which audiences perform best across Search, Display, and Video.
What Changed in 2025-2026
Google has made significant changes to its advertising platform over the past two years, affecting both the available data and how campaigns operate. Understanding these changes is critical for anyone working with historical data, building dashboards, or optimizing campaign strategy.
Broad match AI expansion (2025)
Google continued expanding broad match keyword coverage with AI-powered intent matching. Broad match keywords now match queries with the same implied intent even when no individual words overlap. For example, a broad match keyword "affordable housing" might match "low-cost apartments near me." This means the search terms report is more important than ever for monitoring what queries actually trigger your ads. However, search terms visibility was further restricted for low-volume queries, making negative keyword management more challenging.
Performance Max reporting improvements (2025-2026)
Google expanded PMax reporting with asset group-level performance data, asset performance labels (Best, Good, Low, Learning) for individual creative assets, and search themes showing grouped query categories that triggered your ads. Channel-level reporting was added in late 2025, allowing advertisers to see a breakdown of PMax performance across Search, Shopping, Display, YouTube, Discover, and Gmail for the first time.
Enhanced Conversions expansion (2025)
Enhanced Conversions became the default recommended setup for conversion tracking. The feature now supports Enhanced Conversions for leads (matching offline conversion data to ad interactions using hashed first-party data) and Enhanced Conversions for web (improving online conversion measurement accuracy). Google also expanded consent mode v2 integration, requiring advertisers in the EEA to implement consent mode for conversions to be modeled when user consent is not granted.
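Enhanced Conversions for leads matches offline conversions to ad interactions using hashed first-party identifiers. A minimal sketch of the normalize-then-hash step — SHA-256 of the whitespace-trimmed, lowercased email, per Google's published guidance — with the actual upload call to the API omitted:

```python
import hashlib

def normalize_and_hash(email: str) -> str:
    """Normalize an email (trim whitespace, lowercase) and SHA-256 hash it,
    the preprocessing required before uploading identifiers for
    Enhanced Conversions for leads."""
    normalized = email.strip().lower()
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

hashed = normalize_and_hash("  User@Example.com ")
```

Normalization matters: without it, `User@Example.com` and `user@example.com` hash to different values and never match.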
Deprecations and removals
The average position metric (deprecated in 2019) was fully removed from historical data exports in 2025 — use search top impression % and absolute top impression % instead. Expanded Text Ads can no longer be created (since June 2022) but existing ones still serve and report data. Several Smart campaign features were consolidated into Performance Max. The Similar Audiences feature was fully sunset in 2023, with Google recommending optimized targeting and audience expansion as replacements.
AI-powered features in bidding and targeting (2026)
Google introduced AI Max for Search campaigns, which automatically applies broad match behavior, dynamic search ads functionality, and auto-generated assets to standard Search campaigns. This changes how keywords are matched and which creative is served, making search term analysis and asset performance monitoring even more critical. New brand exclusions for PMax were added to prevent Performance Max from cannibalizing branded Search traffic.
Common Mistakes When Analyzing Google Ads Data
Even experienced advertisers make these mistakes when working with Google Ads metrics. Avoiding them will save you from flawed analyses and poor optimization decisions.
1. Confusing Conversions and All Conversions
The Conversions column includes only primary conversion actions — the ones you've opted into for Smart Bidding optimization. All Conversions adds secondary actions plus cross-device, view-through, and modeled conversions. Comparing ROAS based on "All Conversions Value" against a target set for "Conversions" will make performance look better than it actually is. Always check which column your reports use.
2. Ignoring search impression share data
A campaign with a 5% conversion rate and 20% impression share is leaving massive opportunity on the table. Always check search lost IS (budget) and search lost IS (rank) to understand whether you need more budget or better Quality Score. These two metrics tell you exactly where to invest optimization effort.
3. Optimizing for CTR without context
A high CTR on irrelevant queries wastes budget. A keyword with 8% CTR but 0% conversion rate is worse than one with 2% CTR and 5% conversion rate. Always analyze CTR alongside conversion rate and cost per conversion. High CTR is only valuable when it drives qualified traffic.
4. Averaging ROAS across campaigns
ROAS is a ratio — averaging it across campaigns gives mathematically incorrect results. A campaign with $100 spend and 10x ROAS ($1,000 revenue) averaged with a campaign with $10,000 spend and 2x ROAS ($20,000 revenue) gives a misleading 6x average. The actual blended ROAS is ($1,000 + $20,000) ÷ ($100 + $10,000) = 2.08x. Always recalculate from summed totals.
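The arithmetic above can be wrapped in a small helper — a sketch assuming you already have per-campaign (spend, revenue) totals:

```python
def blended_roas(campaigns):
    """Compute blended ROAS from summed totals, never from averaged ratios.

    campaigns: iterable of (spend, revenue) tuples.
    """
    total_spend = sum(spend for spend, _ in campaigns)
    total_revenue = sum(revenue for _, revenue in campaigns)
    return total_revenue / total_spend if total_spend else 0.0

# The example from the text: $100 spend at 10x vs. $10,000 spend at 2x.
naive_avg = (10.0 + 2.0) / 2                              # misleading 6x "average"
actual = blended_roas([(100, 1_000), (10_000, 20_000)])   # ~2.08x from summed totals
```

The same rule applies to every ratio metric: CTR, conversion rate, and CPC must all be recomputed from summed numerators and denominators, not averaged across rows.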
5. Not checking match type distribution
With broad match AI expansion, your exact match keywords may be triggering on queries you don't intend. Regularly check the search terms report and segment keyword performance by match type to ensure broad match keywords aren't cannibalizing exact match or phrase match performance with lower-intent queries.
6. Overlooking the micros format in API data
The Google Ads API returns all monetary values in micros — a cost of 1,500,000 means $1.50, not $1.5 million. Forgetting to divide by 1,000,000 when building dashboards or automations leads to absurdly wrong cost figures. The metrics.cost_micros, metrics.average_cpc, metrics.average_cpm, and all bid fields use this format.
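A one-line conversion helper prevents this class of bug from leaking into dashboards — a minimal sketch:

```python
MICROS_PER_UNIT = 1_000_000

def micros_to_currency(micros: int) -> float:
    """Convert a Google Ads API micros value to standard currency units.
    1,500,000 micros -> 1.50, not 1.5 million."""
    return micros / MICROS_PER_UNIT
```

Routing every monetary field (`cost_micros`, `average_cpc`, bid amounts) through a single helper like this makes the conversion impossible to forget in one place and remember in another.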
7. Treating Quality Score as a real-time metric
Quality Score is a historical diagnostic, not a real-time auction signal. Google uses a more granular, real-time version of quality signals in each auction. The Quality Score you see in your account is updated periodically and represents an aggregate assessment. Don't make panic changes based on a Quality Score drop of 1 point — look at the component signals (expected CTR, ad relevance, landing page) and trends over time.
8. Comparing PMax and standard campaigns without controlling for overlap
Performance Max campaigns compete in the same auctions as your standard Search and Shopping campaigns. Comparing their performance side-by-side without accounting for incremental lift is misleading — PMax may be taking credit for conversions that your standard campaigns would have captured anyway. Use conversion lift experiments and check for branded search query overlap between PMax and your brand Search campaigns.
9. Using conversion data without checking the attribution model
Google Ads defaults to data-driven attribution (DDA), which distributes conversion credit across multiple touchpoints. If you compare data from an account using DDA with one using last-click attribution, the numbers won't be comparable. Always check the attribution model configured on each conversion action (exposed in the API as conversion_action.attribution_model_settings.attribution_model) to know which model applies. When switching attribution models, expect a shift in how conversions are distributed across campaigns — top-of-funnel campaigns typically gain credit under DDA, while last-click overweights bottom-funnel.
10. Ignoring conversion lag in recent data
Conversions often take days or weeks to be attributed back to the original click. Looking at the last 3-7 days of conversion data and making budget decisions will undercount conversions that haven't been attributed yet. Use the segments.conversion_lag_bucket segment to understand your typical conversion lag, then exclude the most recent days from optimization decisions. For B2B accounts, conversion lag can exceed 30 days — making the last month of data unreliable for decision-making.
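One way to operationalize this is to clip the reporting window by your observed lag before making budget decisions. A sketch — the 7-day exclusion is an assumed account-specific value you would derive from your own conversion_lag_bucket data:

```python
from datetime import date, timedelta

def reporting_window(days_back: int, lag_days: int, today: date) -> tuple[date, date]:
    """Return an inclusive (start, end) date range of length days_back
    that excludes the most recent lag_days, where conversions are
    still being attributed and counts are immature."""
    end = today - timedelta(days=lag_days)
    start = end - timedelta(days=days_back - 1)
    return start, end

# e.g. a 30-day window, excluding the last 7 days of immature data
start, end = reporting_window(30, 7, today=date(2026, 3, 15))
```

The returned range plugs directly into a GAQL `segments.date BETWEEN` filter once formatted as `YYYY-MM-DD` strings.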
Google Ads API: Key GAQL Query Patterns
For developers and analysts working directly with the Google Ads API, understanding common GAQL query patterns is essential for pulling the right data. Here are the most important patterns for the dimensions and metrics covered in this guide.
Basic campaign performance query
To pull campaign-level performance with core metrics, query the campaign resource and include the metrics you need in the SELECT clause. Add a WHERE clause to filter by date range and campaign status. The ORDER BY clause lets you sort results — for example, by cost descending to see your highest-spending campaigns first.
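A sketch of such a query — the field names follow the API reference used throughout this guide, the date range and status filter are example choices, and the surrounding client call (e.g. `GoogleAdsService.search_stream`) is omitted:

```python
# Campaign-level performance: core metrics, filtered to enabled
# campaigns over the last 30 days, sorted by spend descending.
CAMPAIGN_QUERY = """
    SELECT
      campaign.id,
      campaign.name,
      metrics.impressions,
      metrics.clicks,
      metrics.cost_micros,
      metrics.conversions
    FROM campaign
    WHERE segments.date DURING LAST_30_DAYS
      AND campaign.status = 'ENABLED'
    ORDER BY metrics.cost_micros DESC
"""
```

Remember that `metrics.cost_micros` comes back in micros — divide by 1,000,000 before displaying it.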
Keyword-level query with Quality Score
For keyword analysis, query the keyword_view resource which joins keyword dimensions with their performance metrics. Include ad_group_criterion.quality_info.quality_score along with the three component scores. Note that Quality Score is only populated for keywords with sufficient impression volume — low-traffic keywords may return null values.
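A sketch of that keyword query — the component score fields (`creative_quality_score` for ad relevance, `post_click_quality_score` for landing page experience, `search_predicted_ctr` for expected CTR) sit under `ad_group_criterion.quality_info`; execution via an authenticated client is assumed:

```python
# Keyword dimensions plus Quality Score and its three components.
# quality_score may come back null for low-traffic keywords, so
# downstream code should handle missing values.
KEYWORD_QUERY = """
    SELECT
      ad_group_criterion.keyword.text,
      ad_group_criterion.keyword.match_type,
      ad_group_criterion.quality_info.quality_score,
      ad_group_criterion.quality_info.creative_quality_score,
      ad_group_criterion.quality_info.post_click_quality_score,
      ad_group_criterion.quality_info.search_predicted_ctr,
      metrics.impressions,
      metrics.clicks
    FROM keyword_view
    WHERE segments.date DURING LAST_30_DAYS
"""
```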
Cross-segment queries
When you add segments like segments.device or segments.ad_network_type to a query, the results are automatically broken down by those segments. This means a campaign that previously returned one row will now return multiple rows — one per segment value. Be aware that some segment combinations are incompatible: the API will return an error if you try to combine segments that cannot be queried together (e.g., certain conversion segments with certain resource types).
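Because segmenting multiplies rows, downstream code usually needs to re-aggregate them. A sketch with hypothetical already-fetched rows (the dict shape is illustrative, not the API's actual response format):

```python
from collections import defaultdict

def totals_by_campaign(rows):
    """Re-aggregate device-segmented rows back to campaign-level clicks.
    rows: dicts with 'campaign', 'device', and 'clicks' keys."""
    totals = defaultdict(int)
    for row in rows:
        totals[row["campaign"]] += row["clicks"]
    return dict(totals)

# One campaign now spans multiple rows — one per segment value.
segmented = [
    {"campaign": "Brand", "device": "MOBILE", "clicks": 120},
    {"campaign": "Brand", "device": "DESKTOP", "clicks": 80},
    {"campaign": "Generic", "device": "MOBILE", "clicks": 50},
]
```

Counts and costs sum cleanly like this; ratio metrics (CTR, conversion rate) must instead be recomputed from the summed numerators and denominators.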
Important API limitations
The Google Ads API has several limitations to be aware of. You cannot SELECT fields from incompatible resources in the same query — for example, you cannot get keyword text and campaign budget in a single query from the keyword_view resource. Some metrics like impression share are approximate and may show "--" when the data sample is too small (below the 10% threshold). Historical Quality Score data is only available for dates after the feature was introduced and may have gaps for keywords with insufficient traffic.
Rate limits also apply: developer tokens are subject to daily operation limits that depend on their access level (Basic access tokens are capped at 15,000 operations per day), and complex queries against large accounts may time out. Use the PARAMETERS clause (for example, include_drafts = true) to set options that affect query scope and performance.
