AppsFlyer is the leading mobile attribution and marketing analytics platform, used by thousands of app developers and marketers to measure the performance of their user acquisition campaigns. Whether you're building custom reporting tools, pulling data through the AppsFlyer API, or analyzing campaign performance in the AppsFlyer dashboard, understanding the full landscape of available dimensions and metrics is essential for optimizing your mobile growth strategy.
This guide provides a complete reference of every major dimension and metric available in AppsFlyer as of 2026. We've organized them by functional area, included API field names for developers and analysts, and added practical context on when and how to use each one.
How AppsFlyer Data Is Structured
AppsFlyer's data model centers on attribution — connecting every app install and in-app event back to the marketing touchpoint that drove it. Data flows in three stages: (1) a user sees or clicks an ad (impression/click), (2) the user installs and opens the app (install), and (3) the user performs actions within the app (in-app events). Each stage generates data with its own dimensions and metrics.
Attribution uses a last-click model by default — the last ad interaction before install gets credit. For iOS devices with App Tracking Transparency (ATT), AppsFlyer uses a combination of IDFA-based attribution (for opt-in users), probabilistic modeling, and SKAdNetwork data. The attribution window is configurable per media source (default 7 days for clicks, 1 day for views).
All data is organized by app (each app has its own dashboard and API access), and can be broken down by media source (the ad network or partner), campaign, ad set, ad, and channel. This hierarchy mirrors how most ad platforms structure their campaigns.
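To make the attribution rules above concrete, here is a minimal sketch of a last-click model with the default lookback windows (7-day click, 1-day view). The click-over-impression precedence and the tuple-based tie-breaking are simplifications for illustration, not AppsFlyer's exact internal logic:

```python
from datetime import datetime, timedelta

# Defaults from the text above; both are configurable per media source.
CLICK_WINDOW = timedelta(days=7)
VIEW_WINDOW = timedelta(days=1)

def attribute_install(install_time, touchpoints):
    """Pick the attributed source: clicks outrank impressions, then recency wins.
    Returns (media_source, kind) or None when the install counts as organic."""
    eligible = []
    for source, kind, ts in touchpoints:
        window = CLICK_WINDOW if kind == "click" else VIEW_WINDOW
        if ts <= install_time <= ts + window:
            eligible.append((kind == "click", ts, source, kind))
    if not eligible:
        return None  # no eligible touchpoint inside any window: organic
    _, _, source, kind = max(eligible)  # click-priority first, then latest timestamp
    return source, kind

install = datetime(2026, 1, 10, 12, 0)
touches = [
    ("tiktokglobal_int", "impression", datetime(2026, 1, 9, 20, 0)),
    ("googleadwords_int", "click", datetime(2026, 1, 5, 9, 0)),
]
print(attribute_install(install, touches))
```

Note how the older click still beats the more recent impression under click precedence; with no eligible touchpoints, the install falls through to organic.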
App and Media Source Dimensions
These dimensions identify the app being measured and the media sources driving installs. Media source is the most important attribution dimension — it tells you which ad network, partner, or owned channel gets credit for each install.
| Dimension | API Field | Description |
|---|---|---|
| App ID | app_id | Unique identifier for the app (bundle ID for iOS, package name for Android) |
| App Name | app_name | Display name of the app |
| Platform | platform | Operating system: ios or android |
| App Version | app_version | Version number of the app installed by the user |
| Media Source | media_source | Ad network or partner attributed to the install (e.g., Facebook Ads, googleadwords_int, tiktokglobal_int) |
| Channel | channel | Marketing channel: paid, organic, owned media, referral |
| Agency | agency | Agency managing the media spend (when applicable) |
| Partner | partner | Integrated partner identifier (AppsFlyer partner ID) |
| Is Retargeting | is_retargeting | Whether the install/event came from a retargeting campaign |
| Attribution Type | attribution_type | How attribution was determined: click, impression (view-through), or organic |
| SDK Version | sdk_version | Version of the AppsFlyer SDK embedded in the app |
Campaign and Ad Dimensions
Campaign dimensions mirror the hierarchy of your ad platform campaigns. They enable granular analysis of which campaigns, ad groups, and individual creatives drive the best installs and downstream conversions.
| Dimension | API Field | Description |
|---|---|---|
| Campaign Name | campaign | Name of the ad campaign (passed from the ad network) |
| Campaign ID | campaign_id | Unique identifier for the campaign in the ad network |
| Ad Set Name | adset | Name of the ad set or ad group within the campaign |
| Ad Set ID | adset_id | Unique identifier for the ad set in the ad network |
| Ad Name | ad | Name of the individual ad or creative |
| Ad ID | ad_id | Unique identifier for the ad in the ad network |
| Ad Type | ad_type | Creative format: banner, interstitial, video, native, playable, rewarded |
| Keywords | keywords | Search keywords that triggered the ad (for search campaigns) |
| Site ID | site_id | Publisher site or app where the ad was displayed |
| Sub-Site ID | sub_site_id | Sub-publisher identifier for networks with multiple publishers |
| Sub Parameters (1-5) | sub_param_1 through sub_param_5 | Custom parameters passed through attribution links for additional segmentation |
| Click/Impression Time | click_time / impression_time | Timestamp of the last ad interaction before install |
| Install Time | install_time | Timestamp when the app was first opened after install |
| Click-to-Install Time | click_to_install_time | Time elapsed between the click and the install (seconds) |
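The click-to-install time (CTIT) field is simply the difference between the last two timestamps in the table. A small helper shows the calculation, assuming the `yyyy-mm-dd hh:mm:ss` timestamp format commonly seen in raw-data exports:

```python
from datetime import datetime

def click_to_install_seconds(click_time: str, install_time: str) -> int:
    """CTIT in seconds, from string timestamps as they appear in raw-data reports."""
    fmt = "%Y-%m-%d %H:%M:%S"
    delta = datetime.strptime(install_time, fmt) - datetime.strptime(click_time, fmt)
    return int(delta.total_seconds())

print(click_to_install_seconds("2026-01-10 12:00:00", "2026-01-10 12:04:30"))  # 270
```

CTIT matters beyond reporting: very short values (under ~10 seconds) are a classic click-injection signal, covered in the fraud section later in this guide.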
Attribution Dimensions
Attribution dimensions describe how AppsFlyer determined the source of each install — the method, window, and confidence level. These are critical for understanding data quality and the impact of privacy changes on your measurement.
| Dimension | API Field | Description |
|---|---|---|
| Match Type | match_type | Attribution method: id_matching (IDFA/GAID), probabilistic, srn (self-reporting network), or skan |
| Attribution Lookback Window | attribution_lookback | Time window used for attribution (e.g., 7d click, 1d view) |
| Is Primary Attribution | is_primary_attribution | Whether this is the primary attributed source (vs. multi-touch assist) |
| Re-Attribution Window | reattribution_window | Inactivity period before a returning user counts as re-attributed |
| IDFA/GAID | idfa / advertising_id | Device advertising identifier (available when user opts in on iOS, always on Android) |
| ATT Status | att_status | App Tracking Transparency consent: authorized, denied, not_determined, restricted |
| Customer User ID | customer_user_id | Your internal user identifier mapped to the AppsFlyer profile |
| AppsFlyer ID | appsflyer_id | AppsFlyer's unique device identifier |
| Country | country_code | Country of the user based on IP geolocation at install time |
| City | city | City from IP geolocation |
| Device Type | device_type | Specific device model (iPhone 16, Pixel 9, Samsung Galaxy S25) |
| OS Version | os_version | Operating system version (iOS 18.2, Android 15) |
| Carrier | carrier | Mobile carrier network (AT&T, Verizon, T-Mobile, etc.) |
| Language | language | Device language setting |
Core Metrics
These are the fundamental metrics that measure the top of your acquisition funnel — from ad exposure through installation. They answer the basic question: how effectively are your campaigns driving app installs?
| Metric | API Field | Description | Formula / Notes |
|---|---|---|---|
| Impressions | impressions | Number of times your ad was displayed | Reported by the ad network via cost integration |
| Clicks | clicks | Number of clicks on your ads | Tracked by AppsFlyer attribution links |
| Click-Through Rate (CTR) | ctr | Percentage of impressions that resulted in a click | (Clicks ÷ Impressions) × 100 |
| Installs | installs | Total attributed (non-organic) installs | First app open after download, attributed to a media source |
| Organic Installs | organic_installs | Installs with no attributed media source | App Store search, word of mouth, or unmeasured channels |
| Total Installs | total_installs | Sum of attributed and organic installs | Attributed Installs + Organic Installs |
| Click-to-Install Rate (CTI) | conversion_rate | Percentage of clicks that resulted in an install | (Installs ÷ Clicks) × 100 |
| Impression-to-Install Rate | impression_conversion_rate | Percentage of impressions that resulted in an install | (Installs ÷ Impressions) × 100 |
| Re-Attributions | reattributions | Lapsed users who returned through a paid touchpoint | Users inactive for the re-attribution window who re-engaged |
| Re-Engagements | reengagements | Existing users who interacted with a retargeting ad | Users who still have the app and re-opened it via a retargeting campaign |
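The rate formulas in the table above compose into a simple funnel calculation. This sketch implements them directly, with zero-denominator guards (the field names here are local variables, not API fields):

```python
def funnel_metrics(impressions, clicks, installs, organic_installs):
    """Derive the core funnel rates defined in the table above."""
    pct = lambda num, den: round(num / den * 100, 2) if den else 0.0
    return {
        "ctr": pct(clicks, impressions),                # (Clicks ÷ Impressions) × 100
        "cti": pct(installs, clicks),                   # (Installs ÷ Clicks) × 100
        "impression_to_install": pct(installs, impressions),
        "total_installs": installs + organic_installs,  # attributed + organic
    }

print(funnel_metrics(impressions=500_000, clicks=12_000,
                     installs=1_800, organic_installs=600))
```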
Conversion Metrics (In-App Events)
In-app event metrics measure what users do after installing your app. These are the most important metrics for understanding the quality of installs from each media source — installs are only valuable if they lead to meaningful user actions.
| Metric | API Field | Description | Formula / Notes |
|---|---|---|---|
| Total Events | event_counter | Total count of in-app events (all types combined) | Sum of all tracked event occurrences |
| Unique Users (Events) | unique_users | Unique users who performed at least one in-app event | Deduplicated by AppsFlyer ID |
| Purchases | af_purchase | In-app purchase events | Standard event with revenue and currency parameters |
| Revenue | event_revenue | Total revenue from in-app events with monetary value | Sum of revenue parameters across all revenue-generating events |
| Revenue Per Install | arpu | Average revenue per attributed install (labeled ARPU, though the denominator here is installs rather than active users) | Total Revenue ÷ Total Installs |
| Lifetime Value (LTV) | ltv | Cumulative revenue attributed to each install over time | Revenue accumulated since install, attributed to the original media source |
| Registrations | af_complete_registration | Account registration or signup completions | Standard event for measuring registration funnel |
| Subscriptions | af_subscribe | Subscription events (free trial or paid) | Often includes trial_start and subscribe as separate events |
| Tutorial Completion | af_tutorial_completion | Users who completed the app onboarding tutorial | Key activation metric for user quality |
| Level Achieved | af_level_achieved | Game level or milestone reached | Common for gaming apps — indicates engagement depth |
| Add to Cart | af_add_to_cart | Items added to shopping cart in e-commerce apps | Mid-funnel conversion event |
| Content View | af_content_view | Product or content page views within the app | Top-of-funnel engagement event |
Retention Metrics
Retention metrics measure how well your app keeps users coming back after they install. These are cohort-based metrics — they track groups of users who installed on the same day or from the same source and measure their return rates over time.
| Metric | Description | Formula / Notes |
|---|---|---|
| Day 1 Retention | Percentage of users who opened the app the day after install | (Users active on Day 1 ÷ Total Installs in cohort) × 100 |
| Day 3 Retention | Percentage of users who opened the app 3 days after install | Key early retention indicator for app quality |
| Day 7 Retention | Percentage of users who opened the app 7 days after install | Primary benchmark for mobile app retention |
| Day 14 Retention | Percentage of users who opened the app 14 days after install | Mid-term retention indicator |
| Day 30 Retention | Percentage of users who opened the app 30 days after install | Long-term retention — strong indicator of sustainable growth |
| Rolling Retention | Percentage of users active on Day N or any day after | Less strict than classic retention — counts anyone still active |
| Uninstalls | Number of users who uninstalled the app | Detected via silent push notifications (24-48 hour delay) |
| Uninstall Rate | Percentage of installs that resulted in uninstall | (Uninstalls ÷ Installs) × 100 within a time window |
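The distinction between classic Day-N retention and rolling retention is easiest to see in code. This sketch computes both from per-user activity dates; the data shapes are assumptions for illustration:

```python
from datetime import date

def day_n_retention(install_date, activity_dates, cohort_size, n, rolling=False):
    """Classic Day-N retention counts users active exactly N days after install;
    rolling retention counts users active on Day N or any later day."""
    active = 0
    for user_days in activity_dates:  # one set of active dates per cohort user
        offsets = {(d - install_date).days for d in user_days}
        hit = any(o >= n for o in offsets) if rolling else (n in offsets)
        active += hit
    return round(active / cohort_size * 100, 1)

cohort = date(2026, 1, 1)
users = [
    {date(2026, 1, 2), date(2026, 1, 8)},  # active on Day 1 and Day 7
    {date(2026, 1, 2)},                    # active on Day 1 only
    set(),                                 # never returned
]
print(day_n_retention(cohort, users, cohort_size=3, n=1))             # classic D1
print(day_n_retention(cohort, users, cohort_size=3, n=7, rolling=True))
```

Note that the second user counts toward classic Day 1 but not rolling Day 7, while the first user counts toward both: rolling retention is always greater than or equal to classic retention for the same day.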
Cohort Metrics
Cohort metrics track groups of users over time to measure cumulative performance from each acquisition source. Unlike point-in-time metrics, cohort analysis shows how user value develops over days, weeks, and months — essential for calculating true ROI and payback periods.
| Metric | Description | Formula / Notes |
|---|---|---|
| Cumulative Revenue | Total revenue generated by the cohort through Day N | Running total of revenue from install date to Day N |
| Cumulative ROAS | Return on ad spend by Day N for the cohort | (Cumulative Revenue through Day N ÷ Ad Spend) × 100 |
| Cumulative Events | Total in-app events performed by the cohort through Day N | Running total of specific event counts |
| Revenue Per User (Day N) | Average revenue per user in the cohort by Day N | Cumulative Revenue ÷ Cohort Size |
| Sessions per User (Day N) | Average app sessions per user by Day N | Total sessions ÷ Cohort Size through Day N |
| Payback Period | Number of days until cumulative revenue exceeds ad spend | The day when cumulative ROAS crosses 100% |
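Cumulative ROAS and payback period fall out of the same running total. A minimal sketch, assuming daily cohort revenue is already aggregated (e.g., from a cohort report export):

```python
def cumulative_roas(daily_revenue, ad_spend, n):
    """(Cumulative Revenue through Day N ÷ Ad Spend) × 100, per the table above."""
    return round(sum(daily_revenue[: n + 1]) / ad_spend * 100, 1)

def payback_day(daily_revenue, ad_spend):
    """First day cumulative revenue meets or exceeds spend (cumulative ROAS >= 100%);
    None if the cohort never pays back within the measured window."""
    cumulative = 0.0
    for day, revenue in enumerate(daily_revenue):
        cumulative += revenue
        if cumulative >= ad_spend:
            return day
    return None

spend = 1_000.0
revenue_by_day = [120, 90, 80, 75, 70, 260, 150, 140, 60]  # Day 0 through Day 8
print(cumulative_roas(revenue_by_day, spend, 7))
print(payback_day(revenue_by_day, spend))
```

In this example the cohort sits just below 100% ROAS at Day 7 and crosses the payback threshold on Day 8, which is exactly the kind of source-by-source comparison the payback metric is meant to enable.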
SKAdNetwork Dimensions
SKAdNetwork (SKAN) is Apple's privacy-preserving attribution framework for iOS. It provides limited but privacy-compliant attribution data without exposing user-level identifiers. Understanding SKAN dimensions is critical for iOS measurement post-ATT.
| Dimension | Description |
|---|---|
| SKAN Campaign ID | Numeric campaign identifier (limited values per network) |
| SKAN Conversion Value | 6-bit value (0-63) encoding post-install behavior |
| SKAN Coarse Conversion Value | High, medium, or low (SKAN 4.0+ for privacy thresholds) |
| SKAN Source Domain | Website domain for web-to-app attribution (SKAN 4.0+) |
| SKAN Source App ID | Identifier of the app where the ad was shown |
| SKAN Postback Window | Which measurement window: first (0-2 days), second (3-7 days), third (8-35 days) |
| SKAN Network ID | Ad network identifier registered with Apple |
| SKAN Version | SKAdNetwork API version: 2.2, 3.0, 4.0 |
| Decoded Event | AppsFlyer's decoded interpretation of the conversion value into meaningful events |
| Decoded Revenue Range | Revenue bracket estimated from the conversion value mapping |
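To show why the 6-bit conversion value is so constrained, here is a hypothetical encoding scheme: 3 bits for a revenue bucket and 3 bits for behavioral flags. This is purely illustrative of the bit-packing trade-off, not AppsFlyer's actual conversion-value mapping, and the bucket boundaries are made up:

```python
# Hypothetical 6-bit scheme: 3 bucket bits + 3 flag bits = values 0-63.
REVENUE_BUCKETS = [0, 1, 5, 10, 25, 50, 100, 250]  # assumed USD bucket floors

def encode_cv(revenue_bucket: int, registered: bool, tutorial: bool, purchased: bool) -> int:
    """Pack post-install signals into a 0-63 SKAN conversion value."""
    flags = (registered << 0) | (tutorial << 1) | (purchased << 2)
    return (revenue_bucket << 3) | flags

def decode_cv(cv: int) -> dict:
    """Reverse the packing: recover the revenue bucket and event flags."""
    return {
        "revenue_bucket_usd": REVENUE_BUCKETS[cv >> 3],
        "registered": bool(cv & 1),
        "tutorial": bool(cv & 2),
        "purchased": bool(cv & 4),
    }

cv = encode_cv(revenue_bucket=4, registered=True, tutorial=True, purchased=False)
print(cv, decode_cv(cv))
```

With only 64 values to spend, every bit allocated to revenue resolution is a bit taken away from event tracking, which is why the "Decoded Event" and "Decoded Revenue Range" dimensions are estimates rather than exact figures.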
Fraud Metrics (Protect360)
Protect360 is AppsFlyer's fraud detection and prevention suite. These metrics help you identify fraudulent installs and events, block them in real-time, and measure the financial impact of fraud on your campaigns.
| Metric | Description | Formula / Notes |
|---|---|---|
| Blocked Installs | Installs flagged and blocked before attribution | Real-time blocking prevents fraudulent sources from getting credit |
| Blocked Install Rate | Percentage of total install attempts that were blocked | (Blocked Installs ÷ Total Install Attempts) × 100 |
| Post-Attribution Fraud | Installs initially attributed but later detected as fraud | Detected through behavioral analysis after install |
| Fraud Type | Classification of fraud: bots, click flooding, click injection, device farms, SDK spoofing | Each type has different detection signals and remediation |
| Fraud Rate by Source | Percentage of installs from each media source flagged as fraud | Identifies problematic networks and sub-publishers |
| Blocked In-App Events | In-app events from fraudulent installs that were blocked | Prevents revenue inflation from fake events |
| Validation Rules Triggered | Custom fraud rules that flagged suspicious activity | User-defined rules based on CTIT, geo mismatch, device patterns |
| Estimated Savings | Estimated budget saved by blocking fraudulent installs | Blocked Installs × Average CPI for the media source |
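The Estimated Savings formula from the last row is straightforward to reproduce from your own blocked-install and CPI data. The media source names below are hypothetical:

```python
def estimated_savings(blocked_installs_by_source, avg_cpi_by_source):
    """Estimated Savings = Blocked Installs × Average CPI, per media source."""
    return {
        source: round(blocked * avg_cpi_by_source.get(source, 0.0), 2)
        for source, blocked in blocked_installs_by_source.items()
    }

blocked = {"network_a_int": 1_250, "network_b_int": 340}   # hypothetical sources
cpi = {"network_a_int": 1.80, "network_b_int": 3.50}
print(estimated_savings(blocked, cpi))
```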
Cost Metrics
Cost metrics connect your ad spend to AppsFlyer's attribution data, enabling ROI calculations at every level of the campaign hierarchy. Cost data is imported through integrations with ad networks or uploaded manually.
| Metric | API Field | Description | Formula / Notes |
|---|---|---|---|
| Total Spend | cost | Total ad spend for the campaign or media source | Imported from ad network APIs or uploaded via cost ingestion |
| Cost Per Install (CPI) | cpi | Average cost to acquire one install | Total Spend ÷ Installs |
| Cost Per Action (CPA) | cpa | Average cost per in-app event (configurable event) | Total Spend ÷ Target Event Count |
| Cost Per Click (CPC) | average_cpc | Average cost per ad click | Total Spend ÷ Clicks |
| eCPM | ecpm | Effective cost per thousand impressions | (Total Spend ÷ Impressions) × 1,000 |
| ROAS | roi | Return on ad spend | (Total Revenue ÷ Total Spend) × 100 |
| ROI | roi | Return on investment | ((Revenue - Spend) ÷ Spend) × 100 |
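All six cost formulas in the table combine the same handful of inputs. A sketch with zero-spend and zero-denominator guards (local names only, not API fields):

```python
def cost_metrics(spend, installs, clicks, impressions, revenue, target_events):
    """Compute the cost table's formulas for one media source or campaign."""
    safe = lambda num, den: num / den if den else 0.0
    return {
        "cpi": round(safe(spend, installs), 2),            # Total Spend ÷ Installs
        "cpa": round(safe(spend, target_events), 2),       # Total Spend ÷ Target Events
        "cpc": round(safe(spend, clicks), 4),              # Total Spend ÷ Clicks
        "ecpm": round(safe(spend, impressions) * 1_000, 2),
        "roas_pct": round(safe(revenue, spend) * 100, 1),  # (Revenue ÷ Spend) × 100
        "roi_pct": round(safe(revenue - spend, spend) * 100, 1),
    }

print(cost_metrics(spend=5_000, installs=2_000, clicks=40_000,
                   impressions=1_000_000, revenue=7_500, target_events=400))
```

The example makes the ROAS/ROI distinction concrete: the same source shows 150% ROAS but only 50% ROI, because ROI subtracts spend before dividing.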
How to Use AppsFlyer Metrics for Optimization
Knowing which metrics to prioritize at each stage of your app growth strategy is essential. Here is a practical framework for selecting the right metrics.
For user acquisition
Focus on installs, CPI, and CTI rate as your top-of-funnel metrics. Compare CPI across media sources to identify the most cost-efficient channels. Use CTI rate to evaluate creative and targeting quality — a low CTI often indicates poor ad-to-app-store alignment. Monitor organic uplift (the ratio of organic to paid installs) to measure brand awareness generated by paid campaigns.
For user quality assessment
Look beyond installs to Day 1 and Day 7 retention, registration rate, and purchase rate by media source. A source with cheap installs but 5% Day 7 retention is worse than one with expensive installs but 25% retention. Use cohort ROAS at Day 7, Day 14, and Day 30 to understand how quickly each source pays back its acquisition cost.
For revenue optimization
Track ROAS, LTV, and revenue per install by media source and campaign. Use cohort analysis to project long-term revenue from current installs. Compare payback period across sources to allocate budget toward channels that generate sustainable returns, not just cheap installs.
For fraud prevention
Monitor blocked install rate and fraud rate by source daily. Investigate sources with abnormally high CTI rates (>25% is suspicious for display campaigns), short click-to-install times (<10 seconds indicates click injection), or low retention (<2% Day 1 suggests bots). Use validation rules to create custom fraud detection based on your specific patterns.
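The thresholds above translate directly into a monitoring check. This sketch uses the exact cutoffs from the paragraph; treat them as starting points to tune per vertical and campaign type:

```python
def fraud_flags(cti_pct, median_ctit_seconds, d1_retention_pct, campaign_type="display"):
    """Return the fraud signals a source trips, per the heuristics above."""
    flags = []
    if campaign_type == "display" and cti_pct > 25:
        flags.append("suspicious_cti")          # abnormally high click-to-install rate
    if median_ctit_seconds < 10:
        flags.append("possible_click_injection")  # clicks fired just before install
    if d1_retention_pct < 2:
        flags.append("possible_bots")           # installs that never return
    return flags

print(fraud_flags(cti_pct=31.0, median_ctit_seconds=6, d1_retention_pct=1.4))
```

A source tripping all three signals at once is a strong candidate for pausing pending investigation, even before Protect360 formally flags it.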
iOS Privacy and Measurement Challenges
Apple's App Tracking Transparency (ATT) framework, introduced with iOS 14.5, has fundamentally changed mobile attribution. Understanding the impact on your metrics is critical for accurate iOS measurement.
ATT opt-in rates average 30-40% globally, meaning 60-70% of iOS users cannot be attributed using deterministic IDFA matching. For non-opted-in users, AppsFlyer uses a combination of probabilistic modeling and SKAdNetwork data. As a result, iOS install counts carry more modeling uncertainty than their Android counterparts, and some installs that should be attributed to paid sources appear as organic.
SKAdNetwork limitations: SKAN provides aggregated, delayed data (24-48 hours) with limited conversion value resolution (0-63). You cannot get user-level data from SKAN, and campaign ID mapping is limited. AppsFlyer's SKAN solution decodes conversion values and provides estimated metrics, but the data is inherently less granular than traditional attribution.
Common Mistakes When Analyzing AppsFlyer Data
Even experienced mobile marketers make these mistakes. Avoiding them will lead to more accurate measurement and better acquisition decisions.
1. Optimizing for CPI instead of downstream value
A media source with $1 CPI but zero purchases is worse than one with $5 CPI and strong conversion rates. Always optimize for CPA (cost per target event) or ROAS rather than raw install cost. Use cohort analysis to understand the true value of installs from each source.
2. Ignoring organic cannibalization
Some paid campaigns claim credit for installs that would have happened organically. Monitor your organic install rate when scaling paid campaigns — if organic installs drop as paid installs rise, you may be paying for users you would have gotten for free. Use incrementality testing to measure true lift.
3. Comparing iOS and Android metrics directly
iOS attribution is fundamentally different post-ATT. iOS install counts include more modeling uncertainty, retention data may be less complete, and cost metrics may be less accurate. Compare iOS and Android metrics within their own context, not against each other.
4. Not accounting for re-attribution windows
If your re-attribution window is 90 days, a user who uninstalled 89 days ago and reinstalls through a retargeting ad counts as a re-attribution — not a new install. But if they reinstall after 91 days, it counts as a new install. This setting significantly affects install counts and cost calculations.
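The 89-day versus 91-day distinction in the paragraph above is just a window comparison. A sketch, assuming a 90-day re-attribution window and a reinstall driven by a retargeting touchpoint:

```python
from datetime import date

REATTRIBUTION_WINDOW_DAYS = 90  # configurable per app; 90 matches the example above

def classify_reinstall(uninstall_date, reinstall_date, via_retargeting=True):
    """Reinstalls through a retargeting touchpoint inside the window count as
    re-attributions; outside the window (or without a touchpoint) as new installs."""
    inactive_days = (reinstall_date - uninstall_date).days
    if via_retargeting and inactive_days <= REATTRIBUTION_WINDOW_DAYS:
        return "re-attribution"
    return "new_install"

print(classify_reinstall(date(2026, 1, 1), date(2026, 3, 31)))  # 89 days inactive
print(classify_reinstall(date(2026, 1, 1), date(2026, 4, 2)))   # 91 days inactive
```

Because the two outcomes land in different metric buckets (re-attributions vs. installs), the window setting directly shifts your reported CPI and install totals.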
5. Trusting all cost data equally
Cost data quality varies by ad network. Self-reporting networks (Facebook, Google, TikTok) provide their own cost data which may not perfectly align with AppsFlyer's attributed installs. Smaller networks may have delayed or incomplete cost reporting. Always verify cost data accuracy before making major budget decisions.
6. Ignoring fraud signals
Mobile ad fraud remains significant. If a media source shows unusually high CTI rates, extremely short click-to-install times, or near-zero retention, these are strong fraud indicators. Don't wait for Protect360 to flag everything — actively monitor these signals and set up validation rules for proactive detection.
