AppsFlyer is the leading mobile attribution and marketing analytics platform, used by thousands of app developers and marketers to measure the performance of their user acquisition campaigns. Whether you're building custom dashboards, pulling data through the AppsFlyer API, or analyzing campaign performance in the dashboard, understanding the full landscape of available dimensions and metrics is essential for optimizing your mobile growth strategy.

This guide provides a complete reference of every major dimension and metric available in AppsFlyer as of 2026. We've organized them by functional area, included API field names for developers and analysts, and added practical context on when and how to use each one.

How AppsFlyer Data Is Structured

AppsFlyer's data model centers on attribution — connecting every app install and in-app event back to the marketing touchpoint that drove it. Data flows in three stages: (1) a user sees or clicks an ad (impression/click), (2) the user installs and opens the app (install), and (3) the user performs actions within the app (in-app events). Each stage generates data with its own dimensions and metrics.

Attribution uses a last-click model by default — the last ad interaction before install gets credit. For iOS devices with App Tracking Transparency (ATT), AppsFlyer uses a combination of IDFA-based attribution (for opt-in users), probabilistic modeling, and SKAdNetwork data. The attribution window is configurable per media source (default 7 days for clicks, 1 day for views).
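The last-click logic described above can be sketched in a few lines of Python. This is an illustrative model, not AppsFlyer's implementation; the touchpoint data is made up, and the window values match the defaults mentioned (7-day click, 1-day view):

```python
from datetime import datetime, timedelta

# Hypothetical touchpoints: (media_source, interaction_type, timestamp)
touchpoints = [
    ("network_a", "impression", datetime(2026, 1, 1, 9, 0)),
    ("network_b", "click", datetime(2026, 1, 3, 14, 0)),
    ("network_c", "click", datetime(2026, 1, 5, 18, 0)),
]

# Default lookback windows from the text: 7 days for clicks, 1 day for views
WINDOWS = {"click": timedelta(days=7), "impression": timedelta(days=1)}

def attribute_install(install_time, touchpoints):
    """Return the media source of the last eligible touchpoint, else 'organic'."""
    eligible = [
        (source, kind, ts) for source, kind, ts in touchpoints
        if ts <= install_time and install_time - ts <= WINDOWS[kind]
    ]
    if not eligible:
        return "organic"
    # Last-click: the most recent eligible interaction wins
    return max(eligible, key=lambda t: t[2])[0]

install_time = datetime(2026, 1, 6, 12, 0)
print(attribute_install(install_time, touchpoints))  # network_c
```

Note that the impression from network_a is ignored here even though it happened first: it falls outside the 1-day view window, and a later click outranks it anyway under last-click rules.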

All data is organized by app (each app has its own dashboard and API access), and can be broken down by media source (the ad network or partner), campaign, ad set, ad, and channel. This hierarchy mirrors how most ad platforms structure their campaigns.

App and Media Source Dimensions

These dimensions identify the app being measured and the media sources driving installs. Media source is the most important attribution dimension — it tells you which ad network, partner, or owned channel gets credit for each install.

Dimension | API Field | Description
App ID | app_id | Unique identifier for the app (bundle ID for iOS, package name for Android)
App Name | app_name | Display name of the app
Platform | platform | Operating system: ios or android
App Version | app_version | Version number of the app installed by the user
Media Source | media_source | Ad network or partner attributed to the install (e.g., Facebook Ads, googleadwords_int, tiktokglobal_int)
Channel | channel | Marketing channel: paid, organic, owned media, referral
Agency | agency | Agency managing the media spend (when applicable)
Partner | partner | Integrated partner identifier (AppsFlyer partner ID)
Is Retargeting | is_retargeting | Whether the install/event came from a retargeting campaign
Attribution Type | attribution_type | How attribution was determined: click, impression (view-through), or organic
SDK Version | sdk_version | Version of the AppsFlyer SDK embedded in the app

Campaign and Ad Dimensions

Campaign dimensions mirror the hierarchy of your ad platform campaigns. They enable granular analysis of which campaigns, ad groups, and individual creatives drive the best installs and downstream conversions.

Dimension | API Field | Description
Campaign Name | campaign | Name of the ad campaign (passed from the ad network)
Campaign ID | campaign_id | Unique identifier for the campaign in the ad network
Ad Set Name | adset | Name of the ad set or ad group within the campaign
Ad Set ID | adset_id | Unique identifier for the ad set in the ad network
Ad Name | ad | Name of the individual ad or creative
Ad ID | ad_id | Unique identifier for the ad in the ad network
Ad Type | ad_type | Creative format: banner, interstitial, video, native, playable, rewarded
Keywords | keywords | Search keywords that triggered the ad (for search campaigns)
Site ID | site_id | Publisher site or app where the ad was displayed
Sub-Site ID | sub_site_id | Sub-publisher identifier for networks with multiple publishers
Sub Parameters (1-5) | sub_param_1 through sub_param_5 | Custom parameters passed through attribution links for additional segmentation
Click/Impression Time | click_time / impression_time | Timestamp of the last ad interaction before install
Install Time | install_time | Timestamp when the app was first opened after install
Click-to-Install Time | click_to_install_time | Time elapsed between the click and the install (seconds)

Attribution Dimensions

Attribution dimensions describe how AppsFlyer determined the source of each install — the method, window, and confidence level. These are critical for understanding data quality and the impact of privacy changes on your measurement.

Dimension | API Field | Description
Match Type | match_type | Attribution method: id_matching (IDFA/GAID), probabilistic, srn (self-reporting network), or skan
Attribution Lookback Window | attribution_lookback | Time window used for attribution (e.g., 7d click, 1d view)
Is Primary Attribution | is_primary_attribution | Whether this is the primary attributed source (vs. multi-touch assist)
Re-Attribution Window | reattribution_window | Inactivity period before a returning user counts as re-attributed
IDFA/GAID | idfa / advertising_id | Device advertising identifier (available on iOS only when the user opts in via ATT; available on Android unless the user deletes or limits the advertising ID)
ATT Status | att_status | App Tracking Transparency consent: authorized, denied, not_determined, restricted
Customer User ID | customer_user_id | Your internal user identifier mapped to the AppsFlyer profile
AppsFlyer ID | appsflyer_id | AppsFlyer's unique device identifier
Country | country_code | Country of the user based on IP geolocation at install time
City | city | City from IP geolocation
Device Type | device_type | Specific device model (iPhone 16, Pixel 9, Samsung Galaxy S25)
OS Version | os_version | Operating system version (iOS 18.2, Android 15)
Carrier | carrier | Mobile carrier network (AT&T, Verizon, T-Mobile, etc.)
Language | language | Device language setting

Core Metrics

These are the fundamental metrics that measure the top of your acquisition funnel — from ad exposure through installation. They answer the basic question: how effectively are your campaigns driving app installs?

Metric | API Field | Description | Formula / Notes
Impressions | impressions | Number of times your ad was displayed | Reported by the ad network via cost integration
Clicks | clicks | Number of clicks on your ads | Tracked by AppsFlyer attribution links
Click-Through Rate (CTR) | ctr | Percentage of impressions that resulted in a click | (Clicks ÷ Impressions) × 100
Installs | installs | Total attributed (non-organic) installs | First app open after download, attributed to a media source
Organic Installs | organic_installs | Installs with no attributed media source | App Store search, word of mouth, or unmeasured channels
Total Installs | total_installs | Sum of attributed and organic installs | Attributed Installs + Organic Installs
Click-to-Install Rate (CTI) | conversion_rate | Percentage of clicks that resulted in an install | (Installs ÷ Clicks) × 100
Impression-to-Install Rate | impression_conversion_rate | Percentage of impressions that resulted in an install | (Installs ÷ Impressions) × 100
Re-Attributions | reattributions | Lapsed users who returned through a paid touchpoint | Users inactive for the re-attribution window who re-engaged
Re-Engagements | reengagements | Existing users who interacted with a retargeting ad | Users who still have the app and re-opened it via a retargeting campaign
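The three rate formulas in the table share the same shape, so they are easy to compute from raw counts. A minimal sketch with illustrative numbers (the zero-denominator guards are a practical addition, not part of the formulas themselves):

```python
def funnel_metrics(impressions, clicks, installs):
    """Compute the core funnel rates, guarding against zero denominators."""
    ctr = clicks / impressions * 100 if impressions else 0.0
    cti = installs / clicks * 100 if clicks else 0.0
    imp_to_install = installs / impressions * 100 if impressions else 0.0
    return {"ctr": ctr, "cti": cti, "impression_to_install": imp_to_install}

m = funnel_metrics(impressions=500_000, clicks=10_000, installs=800)
# CTR of 2%, CTI of 8%, impression-to-install of 0.16% for this example
print(m)
```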

Conversion Metrics (In-App Events)

In-app event metrics measure what users do after installing your app. These are the most important metrics for understanding the quality of installs from each media source — installs are only valuable if they lead to meaningful user actions.

Metric | API Field | Description | Formula / Notes
Total Events | event_counter | Total count of in-app events (all types combined) | Sum of all tracked event occurrences
Unique Users (Events) | unique_users | Unique users who performed at least one in-app event | Deduplicated by AppsFlyer ID
Purchases | af_purchase | In-app purchase events | Standard event with revenue and currency parameters
Revenue | event_revenue | Total revenue from in-app events with monetary value | Sum of revenue parameters across all revenue-generating events
Revenue Per Install | arpu | Average revenue per install (ARPU) | Total Revenue ÷ Total Installs
Lifetime Value (LTV) | ltv | Cumulative revenue attributed to each install over time | Revenue accumulated since install, attributed to the original media source
Registrations | af_complete_registration | Account registration or signup completions | Standard event for measuring the registration funnel
Subscriptions | af_subscribe | Subscription events (free trial or paid) | Often includes trial_start and subscribe as separate events
Tutorial Completion | af_tutorial_completion | Users who completed the app onboarding tutorial | Key activation metric for user quality
Level Achieved | af_level_achieved | Game level or milestone reached | Common for gaming apps — indicates engagement depth
Add to Cart | af_add_to_cart | Items added to shopping cart in e-commerce apps | Mid-funnel conversion event
Content View | af_content_view | Product or content page views within the app | Top-of-funnel engagement event
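Revenue per install and unique purchasers can be derived from a raw in-app event export. The event log and install count below are made up; the event names follow the af_ convention from the table:

```python
# Hypothetical in-app event log rows: (appsflyer_id, event_name, event_revenue)
events = [
    ("dev1", "af_purchase", 4.99),
    ("dev1", "af_purchase", 9.99),
    ("dev2", "af_content_view", 0.0),
    ("dev3", "af_purchase", 4.99),
]

installs = 100  # installs in the same period/cohort (illustrative)

total_revenue = sum(rev for _, _, rev in events)
revenue_per_install = total_revenue / installs
unique_purchasers = len({uid for uid, name, _ in events if name == "af_purchase"})

print(round(total_revenue, 2))        # 19.97
print(round(revenue_per_install, 4))  # 0.1997
print(unique_purchasers)              # 2
```

Deduplicating purchasers by appsflyer_id (as in the Unique Users row above) matters: dev1 made two purchases but counts once.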

Retention Metrics

Retention metrics measure how well your app keeps users coming back after they install. These are cohort-based metrics — they track groups of users who installed on the same day or from the same source and measure their return rates over time.

Metric | Description | Formula / Notes
Day 1 Retention | Percentage of users who opened the app the day after install | (Users active on Day 1 ÷ Total Installs in cohort) × 100
Day 3 Retention | Percentage of users who opened the app 3 days after install | Key early retention indicator for app quality
Day 7 Retention | Percentage of users who opened the app 7 days after install | Primary benchmark for mobile app retention
Day 14 Retention | Percentage of users who opened the app 14 days after install | Mid-term retention indicator
Day 30 Retention | Percentage of users who opened the app 30 days after install | Long-term retention — strong indicator of sustainable growth
Rolling Retention | Percentage of users active on Day N or any day after | Less strict than classic retention — counts anyone still active
Uninstalls | Number of users who uninstalled the app | Detected via silent push notifications (24-48 hour delay)
Uninstall Rate | Percentage of installs that resulted in uninstall | (Uninstalls ÷ Installs) × 100 within a time window
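Classic Day N retention ("active exactly N days after install") can be computed from per-user install and activity dates. The cohort below is a hypothetical four-user example:

```python
from datetime import date

# Hypothetical cohort: install date plus the set of days each user opened the app
cohort = {
    "u1": {"install": date(2026, 1, 1), "active": {date(2026, 1, 2), date(2026, 1, 8)}},
    "u2": {"install": date(2026, 1, 1), "active": {date(2026, 1, 2)}},
    "u3": {"install": date(2026, 1, 1), "active": set()},
    "u4": {"install": date(2026, 1, 1), "active": {date(2026, 1, 8)}},
}

def day_n_retention(cohort, n):
    """Classic Day N retention: share of the cohort active exactly N days after install."""
    retained = sum(
        1 for u in cohort.values()
        if any((d - u["install"]).days == n for d in u["active"])
    )
    return retained / len(cohort) * 100

print(day_n_retention(cohort, 1))  # 50.0 (u1 and u2)
print(day_n_retention(cohort, 7))  # 50.0 (u1 and u4)
```

Rolling retention would relax the `== n` check to `>= n`, which is why it always reads higher than the classic metric.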

Cohort Metrics

Cohort metrics track groups of users over time to measure cumulative performance from each acquisition source. Unlike point-in-time metrics, cohort analysis shows how user value develops over days, weeks, and months — essential for calculating true ROI and payback periods.

Metric | Description | Formula / Notes
Cumulative Revenue | Total revenue generated by the cohort through Day N | Running total of revenue from install date to Day N
Cumulative ROAS | Return on ad spend by Day N for the cohort | (Cumulative Revenue through Day N ÷ Ad Spend) × 100
Cumulative Events | Total in-app events performed by the cohort through Day N | Running total of specific event counts
Revenue Per User (Day N) | Average revenue per user in the cohort by Day N | Cumulative Revenue ÷ Cohort Size
Sessions per User (Day N) | Average app sessions per user by Day N | Total Sessions ÷ Cohort Size through Day N
Payback Period | Number of days until cumulative revenue exceeds ad spend | The day when cumulative ROAS crosses 100%
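The payback-period definition in the table (the day cumulative ROAS crosses 100%) reduces to a running sum. A sketch with an illustrative $1,000 spend and made-up daily revenue:

```python
def payback_day(daily_revenue, ad_spend):
    """Return the first day (1-indexed) cumulative revenue reaches spend, else None."""
    cumulative = 0.0
    for day, revenue in enumerate(daily_revenue, start=1):
        cumulative += revenue
        if cumulative >= ad_spend:
            return day
    return None  # cohort has not paid back yet within the observed window

# Hypothetical cohort: $1,000 spend, revenue accruing over the first week
revenue_by_day = [120, 150, 180, 200, 220, 250, 300]
print(payback_day(revenue_by_day, 1000))  # 6
```

Returning None for cohorts that have not yet paid back is deliberate: reporting a payback day for an unfinished window would overstate performance.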

SKAdNetwork Dimensions

SKAdNetwork (SKAN) is Apple's privacy-preserving attribution framework for iOS. It provides limited but privacy-compliant attribution data without exposing user-level identifiers. Understanding SKAN dimensions is critical for iOS measurement post-ATT.

Dimension | Description
SKAN Campaign ID | Numeric campaign identifier (limited values per network)
SKAN Conversion Value | 6-bit value (0-63) encoding post-install behavior
SKAN Coarse Conversion Value | High, medium, or low (SKAN 4.0+ for privacy thresholds)
SKAN Source Domain | Website domain for web-to-app attribution (SKAN 4.0+)
SKAN Source App ID | Identifier of the app where the ad was shown
SKAN Postback Window | Which measurement window: first (0-2 days), second (3-7 days), third (8-35 days)
SKAN Network ID | Ad network identifier registered with Apple
SKAN Version | SKAdNetwork API version: 2.2, 3.0, 4.0
Decoded Event | AppsFlyer's decoded interpretation of the conversion value into meaningful events
Decoded Revenue Range | Revenue bracket estimated from the conversion value mapping
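To make the "decoded" rows concrete: a 6-bit conversion value is just a bitfield, and decoding means splitting it back into the components the advertiser chose to encode. The scheme below (low 3 bits for a revenue bracket, high 3 bits for the deepest funnel event) is purely hypothetical; real mappings are configured per app in AppsFlyer's conversion studio:

```python
# Hypothetical 6-bit scheme: low 3 bits = revenue bracket, high 3 bits = funnel event
EVENTS = ["install", "registration", "tutorial", "add_to_cart", "purchase"]
REVENUE_BRACKETS = ["$0", "$0-1", "$1-5", "$5-10", "$10-25", "$25-50", "$50-100", "$100+"]

def decode_conversion_value(cv):
    """Split a SKAN conversion value (0-63) into its event and revenue components."""
    assert 0 <= cv <= 63, "conversion values are 6 bits"
    revenue_bits = cv & 0b111          # low 3 bits
    event_bits = (cv >> 3) & 0b111     # high 3 bits
    event = EVENTS[event_bits] if event_bits < len(EVENTS) else "unknown"
    return event, REVENUE_BRACKETS[revenue_bits]

print(decode_conversion_value(0b100_010))  # ('purchase', '$1-5')
print(decode_conversion_value(0))          # ('install', '$0')
```

The tight bit budget is the whole measurement problem: with 3 bits per axis you get at most 8 revenue brackets and 8 funnel stages per install, total, per postback.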

Fraud Metrics (Protect360)

Protect360 is AppsFlyer's fraud detection and prevention suite. These metrics help you identify fraudulent installs and events, block them in real-time, and measure the financial impact of fraud on your campaigns.

Metric | Description | Formula / Notes
Blocked Installs | Installs flagged and blocked before attribution | Real-time blocking prevents fraudulent sources from getting credit
Blocked Install Rate | Percentage of total install attempts that were blocked | (Blocked Installs ÷ Total Install Attempts) × 100
Post-Attribution Fraud | Installs initially attributed but later detected as fraud | Detected through behavioral analysis after install
Fraud Type | Classification of fraud: bots, click flooding, click injection, device farms, SDK spoofing | Each type has different detection signals and remediation
Fraud Rate by Source | Percentage of installs from each media source flagged as fraud | Identifies problematic networks and sub-publishers
Blocked In-App Events | In-app events from fraudulent installs that were blocked | Prevents revenue inflation from fake events
Validation Rules Triggered | Custom fraud rules that flagged suspicious activity | User-defined rules based on CTIT, geo mismatch, device patterns
Estimated Savings | Estimated budget saved by blocking fraudulent installs | Blocked Installs × Average CPI for the media source
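The validation-rule signals in the table (CTIT, geo mismatch, device patterns) can be sketched as a simple rule engine. The thresholds here are illustrative examples consistent with the fraud-prevention guidance later in this guide, not Protect360 defaults:

```python
def flag_install(ctit_seconds, day1_active, click_country, install_country):
    """Apply simple custom validation rules; returns a list of triggered flags."""
    flags = []
    if ctit_seconds < 10:
        flags.append("possible_click_injection")  # near-instant click-to-install
    if click_country != install_country:
        flags.append("geo_mismatch")              # click and install geos differ
    if not day1_active:
        flags.append("no_day1_activity")          # bot-driven installs rarely return
    return flags

print(flag_install(ctit_seconds=4, day1_active=False,
                   click_country="US", install_country="VN"))
# All three rules fire for this install
```

In practice such rules flag installs for review or blocking per sub-publisher rather than rejecting single installs outright, since any one signal can have a legitimate cause (VPNs, cached app store pages, and so on).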

Cost Metrics

Cost metrics connect your ad spend to AppsFlyer's attribution data, enabling ROI calculations at every level of the campaign hierarchy. Cost data is imported through integrations with ad networks or uploaded manually.

Metric | API Field | Description | Formula / Notes
Total Spend | cost | Total ad spend for the campaign or media source | Imported from ad network APIs or uploaded via cost ingestion
Cost Per Install (CPI) | cpi | Average cost to acquire one install | Total Spend ÷ Installs
Cost Per Action (CPA) | cpa | Average cost per in-app event (configurable event) | Total Spend ÷ Target Event Count
Cost Per Click (CPC) | average_cpc | Average cost per ad click | Total Spend ÷ Clicks
eCPM | ecpm | Effective cost per thousand impressions | (Total Spend ÷ Impressions) × 1,000
ROAS | roi | Return on ad spend | (Total Revenue ÷ Total Spend) × 100
ROI | roi | Return on investment | ((Revenue - Spend) ÷ Spend) × 100
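ROAS and ROI are frequently confused because they differ only in whether spend is subtracted before dividing. A quick worked example with illustrative numbers:

```python
def roas(revenue, spend):
    """ROAS: revenue as a percentage of spend. 100% means break-even."""
    return revenue / spend * 100

def roi(revenue, spend):
    """ROI: net profit as a percentage of spend. 0% means break-even."""
    return (revenue - spend) / spend * 100

# $12,000 revenue on $10,000 spend
print(roas(12_000, 10_000))  # 120.0 — the campaign returned 120% of its spend
print(roi(12_000, 10_000))   # 20.0 — the campaign netted 20% profit
```

The practical takeaway: a campaign at 100% ROAS has merely recouped its media cost, while 100% ROI means it doubled its money.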

How to Use AppsFlyer Metrics for Optimization

Knowing which metrics to prioritize at each stage of your app growth strategy is essential. Here is a practical framework for selecting the right metrics.

For user acquisition

Focus on installs, CPI, and CTI rate as your top-of-funnel metrics. Compare CPI across media sources to identify the most cost-efficient channels. Use CTI rate to evaluate creative and targeting quality — a low CTI often indicates poor ad-to-app-store alignment. Monitor organic uplift (the ratio of organic to paid installs) to measure brand awareness generated by paid campaigns.

For user quality assessment

Look beyond installs to Day 1 and Day 7 retention, registration rate, and purchase rate by media source. A source with cheap installs but 5% Day 7 retention is worse than one with expensive installs but 25% retention. Use cohort ROAS at Day 7, Day 14, and Day 30 to understand how quickly each source pays back its acquisition cost.

For revenue optimization

Track ROAS, LTV, and revenue per install by media source and campaign. Use cohort analysis to project long-term revenue from current installs. Compare payback period across sources to allocate budget toward channels that generate sustainable returns, not just cheap installs.

For fraud prevention

Monitor blocked install rate and fraud rate by source daily. Investigate sources with abnormally high CTI rates (>25% is suspicious for display campaigns), short click-to-install times (<10 seconds indicates click injection), or low retention (<2% Day 1 suggests bots). Use validation rules to create custom fraud detection based on your specific patterns.

iOS Privacy and Measurement Challenges

Apple's App Tracking Transparency (ATT) framework, introduced with iOS 14.5, has fundamentally changed mobile attribution. Understanding the impact on your metrics is critical for accurate iOS measurement.

ATT opt-in rates average 30-40% globally, meaning 60-70% of iOS users cannot be attributed using deterministic IDFA matching. For non-opted-in users, AppsFlyer uses a combination of probabilistic modeling and SKAdNetwork data. This means iOS install counts may be less precise than Android's, and some installs that should be attributed to paid sources appear as organic.

SKAdNetwork limitations: SKAN provides aggregated, delayed data (24-48 hours) with limited conversion value resolution (0-63). You cannot get user-level data from SKAN, and campaign ID mapping is limited. AppsFlyer's SKAN solution decodes conversion values and provides estimated metrics, but the data is inherently less granular than traditional attribution.

Common Mistakes When Analyzing AppsFlyer Data

Even experienced mobile marketers make these mistakes. Avoiding them will lead to more accurate measurement and better acquisition decisions.

1. Optimizing for CPI instead of downstream value

A media source with $1 CPI but zero purchases is worse than one with $5 CPI and strong conversion rates. Always optimize for CPA (cost per target event) or ROAS rather than raw install cost. Use cohort analysis to understand the true value of installs from each source.

2. Ignoring organic cannibalization

Some paid campaigns claim credit for installs that would have happened organically. Monitor your organic install rate when scaling paid campaigns — if organic installs drop as paid installs rise, you may be paying for users you would have gotten for free. Use incrementality testing to measure true lift.

3. Comparing iOS and Android metrics directly

iOS attribution is fundamentally different post-ATT. iOS install counts include more modeling uncertainty, retention data may be less complete, and cost metrics may be less accurate. Compare iOS and Android metrics within their own context, not against each other.

4. Not accounting for re-attribution windows

If your re-attribution window is 90 days, a user who uninstalled 89 days ago and reinstalls through a retargeting ad counts as a re-attribution — not a new install. But if they reinstall after 91 days, it counts as a new install. This setting significantly affects install counts and cost calculations.
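The window logic above is a single comparison, but it is worth making explicit because it silently reclassifies installs on either side of the boundary. A sketch using the 90-day window from the example:

```python
def classify_reinstall(days_inactive, reattribution_window_days=90):
    """Classify a reinstall per the re-attribution window rule described above."""
    if days_inactive <= reattribution_window_days:
        return "re-attribution"  # still inside the window: not a new install
    return "new_install"         # window expired: counted (and billed) as new

print(classify_reinstall(89))  # re-attribution
print(classify_reinstall(91))  # new_install
```

When comparing CPI across periods, check whether the re-attribution window changed in between: lengthening it moves reinstalls out of the install count and mechanically raises CPI even if nothing else moved.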

5. Trusting all cost data equally

Cost data quality varies by ad network. Self-reporting networks (Facebook, Google, TikTok) provide their own cost data which may not perfectly align with AppsFlyer's attributed installs. Smaller networks may have delayed or incomplete cost reporting. Always verify cost data accuracy before making major budget decisions.

6. Ignoring fraud signals

Mobile ad fraud remains significant. If a media source shows unusually high CTI rates, extremely short click-to-install times, or near-zero retention, these are strong fraud indicators. Don't wait for Protect360 to flag everything — actively monitor these signals and set up validation rules for proactive detection.