Adjust is one of the leading mobile measurement platforms (MMPs), providing app marketers with attribution data, audience analytics, and fraud prevention. Whether you're pulling data through the Adjust API, building dashboards in Datascape, or exporting raw callback data, understanding every available dimension and metric is essential for accurate mobile marketing analysis.

This guide covers every dimension and metric available in Adjust as of 2026. We've organized them by category, included API field names, and added practical context on when and how each one matters. From basic install attribution through advanced fraud detection and SKAdNetwork decoding, this is your complete reference.

What Are Dimensions vs Metrics in Adjust?

In Adjust's reporting model, dimensions are the attributes that describe and segment your data — like app name, network, campaign, country, or OS version. They define how your reports are sliced and filtered.

Metrics are the quantitative measurements — installs, clicks, sessions, revenue, retention rates. They tell you what happened and how well your campaigns performed.

Adjust's reporting interface (Datascape) and API let you combine any dimensions with any metrics to build custom views. The raw data export (callbacks and CSV exports) provides event-level records with all dimension fields attached to each activity.
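For example, a dimensions-and-metrics query against the Report Service API can be sketched in a few lines of Python. The endpoint, parameter names, and response shape below are based on Adjust's public API documentation; treat them as assumptions and verify them against the current docs and your own API token:

```python
# Minimal sketch of pulling a dimensions-by-metrics report from Adjust's
# Report Service API. Endpoint, parameters, and the "rows" response key are
# assumptions based on Adjust's public docs; verify against your account.
import requests

API_TOKEN = "YOUR_ADJUST_API_TOKEN"  # placeholder; created in the Adjust dashboard

response = requests.get(
    "https://automate.adjust.com/reports-service/report",
    headers={"Authorization": f"Bearer {API_TOKEN}"},
    params={
        "date_period": "2026-01-01:2026-01-31",
        "dimensions": "app,partner_name,campaign,country",  # how rows are sliced
        "metrics": "installs,clicks,sessions,revenue",      # what is measured
    },
    timeout=30,
)
response.raise_for_status()
for row in response.json().get("rows", []):  # response shape is an assumption
    print(row)
```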

App Dimensions

App dimensions identify the application being tracked and its configuration within Adjust. These are the top-level identifiers that separate data across different apps in your Adjust account.

| Dimension | API Field | Description |
| --- | --- | --- |
| App Name | app | Human-readable name of the app as configured in Adjust |
| App Token | app_token | Unique Adjust identifier for the app (alphanumeric token) |
| Store App ID | store_id | App Store bundle ID (iOS) or Google Play package name (Android) |
| Platform | os_name | Operating system: ios, android, windows, or other supported platforms |
| OS Version | os_version | Specific operating system version (e.g., 17.4, 14.0) |
| Device Type | device_type | Hardware category: phone, tablet, or other |
| Device Name | device_name | Specific device model (e.g., iPhone 15, Samsung Galaxy S24) |
| SDK Version | sdk_version | Version of the Adjust SDK integrated in the app |
| App Version | app_version | Build version of the app at the time of the tracked activity |
| Environment | environment | Sandbox (testing) or production; filters test data from real traffic |
| Country | country | User's country (ISO 3166-1 alpha-2 code) based on IP geolocation |
| Region | region | Geographic region or state within the country |
| City | city | City-level geolocation based on IP address |
| Language | language | Device language setting (ISO 639-1 code) |
| Timezone | timezone | Device timezone offset from UTC |

Tracker and Campaign Dimensions

Adjust uses a four-level tracker hierarchy to organize attribution data. Each level provides increasingly granular detail about where users came from. These dimensions are the backbone of all acquisition analysis in Adjust.

| Dimension | API Field | Description |
| --- | --- | --- |
| Tracker Token | tracker_token | Unique alphanumeric identifier for the tracker link |
| Tracker Name | tracker_name | Human-readable name of the tracker as configured in Adjust |
| Network | network | Top-level attribution source: the ad network or partner (e.g., Meta Ads, Google Ads, TikTok) |
| Campaign | campaign | Second-level grouping: the campaign name within the network |
| Adgroup | adgroup | Third-level grouping: the ad group or ad set within the campaign |
| Creative | creative | Fourth-level grouping: the specific creative or ad variation |
| Click Label | label | Custom label parameter attached to the tracker URL for additional segmentation |
| Engagement Type | engagement_type | How the user engaged: click, impression, or unattributed (organic) |
| Attribution Type | attribution_type | Method used: device matching, probabilistic, SKAdNetwork, or self-attributing network |
| Match Type | match_type | How the click was matched to the install: deterministic (IDFA/GAID) or probabilistic |
| Activity Kind | activity_kind | Type of tracked activity: install, session, event, reattribution, uninstall, reinstall |
| Deeplink | deeplink | Deeplink URL that opened the app, if applicable |

Network and Partner Dimensions

Network dimensions provide metadata about the advertising partners and their integration type with Adjust. These are especially useful when managing multiple ad networks and comparing partner performance across your portfolio.

| Dimension | API Field | Description |
| --- | --- | --- |
| Partner Name | partner_name | Name of the integrated ad network or partner |
| Partner ID | partner_id | Adjust's internal identifier for the partner integration |
| Network Type | network_type | Integration method: self-attributing network (SAN), module partner, or standard |
| SAN Click ID | san_click_id | Click identifier from self-attributing networks (Meta, Google, Snap, TikTok) |
| Impression Based | impression_based | Whether the attribution was based on an impression (view-through) rather than a click |
| Cost Model | cost_model | Pricing model used by the partner: CPI, CPC, CPM, or CPA |
| Store Type | store_type | App store where the install originated: apple_app_store, google_play, huawei_appgallery, etc. |
| FB Install Referrer | fb_install_referrer | Facebook-specific install referrer data for Meta campaign attribution |
| Google Play Referrer | referrer | Google Play install referrer string for Android attribution |

Core Metrics: Installs, Clicks, and Impressions

These fundamental metrics measure the volume and efficiency of your user acquisition campaigns. They form the basis of all performance analysis in Adjust — from initial ad exposure through successful app installation.

| Metric | API Field | Description | Formula / Notes |
| --- | --- | --- | --- |
| Installs | installs | Number of first app opens attributed to a source | Counted when the app is opened for the first time after download |
| Clicks | clicks | Total clicks on tracker links | Includes all click engagements on Adjust tracking URLs |
| Impressions | impressions | Ad impressions recorded through impression tracking URLs | Requires impression URL integration with the ad network |
| Click-Through Rate (CTR) | ctr | Percentage of impressions that resulted in a click | (Clicks ÷ Impressions) × 100 |
| Conversion Rate (CVR) | cvr | Percentage of clicks that resulted in an install | (Installs ÷ Clicks) × 100 |
| Impression Conversion Rate | impression_cvr | Percentage of impressions that led to an install | (Installs ÷ Impressions) × 100 |
| Reinstalls | reinstalls | Users who uninstalled and then reinstalled the app | Tracked separately from first installs to avoid double-counting |
| Reattributions | reattributions | Inactive users re-engaged through a new paid campaign | Requires the user to be past the inactivity window (default 7 days) |
| Uninstalls | uninstalls | Number of app uninstalls detected | Detected via silent push notifications (iOS) or server-side checks (Android) |
| Updates | updates | App version updates detected from existing users | Tracked when app_version changes between sessions |
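
The three ratio metrics follow directly from the raw counts. A quick sketch (with made-up volumes) shows the arithmetic, including the zero-denominator guard you need for impression-only or click-only sources:

```python
# Funnel efficiency ratios from the formulas above, on invented volumes.
def ratio(numerator: int, denominator: int) -> float:
    # Guard against zero denominators (e.g., click-only trackers).
    return (numerator / denominator) * 100 if denominator else 0.0

impressions, clicks, installs = 250_000, 5_000, 400  # example volumes

ctr = ratio(clicks, impressions)               # (Clicks / Impressions) * 100
cvr = ratio(installs, clicks)                  # (Installs / Clicks) * 100
impression_cvr = ratio(installs, impressions)  # (Installs / Impressions) * 100

print(f"CTR {ctr:.2f}%  CVR {cvr:.2f}%  Impression CVR {impression_cvr:.3f}%")
# CTR 2.00%  CVR 8.00%  Impression CVR 0.160%
```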

Conversion Metrics: Events, Sessions, and Revenue

Conversion metrics track what happens after the install — in-app events, session activity, and revenue generation. These metrics connect acquisition spend to actual business outcomes and are essential for calculating ROI and optimizing toward high-value users.

| Metric | API Field | Description | Formula / Notes |
| --- | --- | --- | --- |
| Sessions | sessions | Total number of app sessions (opens) | A new session is counted after the session interval elapses (default 30 minutes of inactivity) |
| Session Length | session_length | Total cumulative time spent in-app across all sessions | Measured in seconds; available as total and average |
| Events | events | Total count of tracked in-app events | Aggregates all configured event tokens; filter by event_token for specific events |
| Event Token | event_token | Unique identifier for a specific in-app event type | Dimension used to filter metrics to a particular event (e.g., purchase, registration) |
| Revenue | revenue | Total revenue from tracked purchase events | Sum of all revenue events in the reporting currency |
| Revenue per Event | revenue_per_event | Average revenue generated per tracked event | Revenue ÷ Events |
| Revenue per Install | revenue_per_install | Average revenue per attributed install | Revenue ÷ Installs |
| Ad Revenue | ad_revenue | Revenue from in-app advertising (ad monetization) | Requires ad revenue SDK integration (AdMob, ironSource, etc.) |
| Subscription Revenue | subscription_revenue | Revenue from subscription purchases and renewals | Tracked through server-to-server integration with app stores |
| Lifetime Value (LTV) | lifetime_value | Cumulative revenue per user from install through the reporting date | Includes all revenue types: IAP, subscriptions, and ad revenue |
| DAU (Daily Active Users) | dau | Unique users who opened the app on a given day | Deduplicated count of users with at least one session per day |
| MAU (Monthly Active Users) | mau | Unique users who opened the app within a 30-day window | Rolling 30-day window of unique active users |
| ARPU | arpu | Average revenue per user | Revenue ÷ Active Users for the period |
| ARPPU | arppu | Average revenue per paying user | Revenue ÷ Users with at least one purchase |
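
The per-user revenue formulas are easy to mix up because each uses a different denominator. A toy calculation makes the distinction concrete (all figures invented):

```python
# Revenue-per-user arithmetic from the table above, on made-up numbers.
installs, active_users, paying_users = 10_000, 6_500, 520
revenue = 18_200.0  # total revenue for the period, in the reporting currency

revenue_per_install = revenue / installs  # Revenue / Installs      -> 1.82
arpu = revenue / active_users             # Revenue / Active Users  -> 2.80
arppu = revenue / paying_users            # Revenue / Paying Users  -> 35.00

print(f"RPI ${revenue_per_install:.2f}  ARPU ${arpu:.2f}  ARPPU ${arppu:.2f}")
```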

Retention Metrics

Retention metrics measure how well you keep users engaged over time. Adjust provides cohort-based retention data that groups users by their install date and tracks how many return on each subsequent day. These metrics are critical for evaluating user quality across acquisition channels.

| Metric | API Field | Description |
| --- | --- | --- |
| Day 0 Retention | retention_rate_d0 | Percentage of users who had a session on the same day they installed (always 100%) |
| Day 1 Retention | retention_rate_d1 | Percentage of users who returned exactly 1 day after install |
| Day 3 Retention | retention_rate_d3 | Percentage of users who returned exactly 3 days after install |
| Day 7 Retention | retention_rate_d7 | Percentage of users who returned exactly 7 days after install |
| Day 14 Retention | retention_rate_d14 | Percentage of users who returned exactly 14 days after install |
| Day 30 Retention | retention_rate_d30 | Percentage of users who returned exactly 30 days after install |
| Rolling Retention Day 7 | rolling_retention_d7 | Percentage of users active on Day 7 or any day after |
| Rolling Retention Day 30 | rolling_retention_d30 | Percentage of users active on Day 30 or any day after |
| Return Rate | return_rate | Percentage of users who returned to the app at least once after install day |
| Sessions per User | sessions_per_user | Average number of sessions per user in the cohort |
| Time to First Event | time_to_first_event | Average time between install and first tracked in-app event |
| Time to First Purchase | time_to_first_purchase | Average time between install and first revenue event |

How to read retention data: If your Day 1 retention is 35%, it means 35% of users who installed on a given date opened the app exactly one day later. Industry benchmarks vary widely: gaming apps typically see 25-35% Day 1 retention while utility apps may see 15-25%. Compare retention by network and campaign to identify which acquisition sources bring the highest-quality users.
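
To make "exactly one day later" concrete, here is a minimal sketch that computes exact-day retention for a single install cohort from toy session data. It is not an Adjust API, just the definition in code:

```python
# Day-N retention for one install cohort, as the table defines it: the share
# of installers who had a session exactly N days after install. Toy data.
from datetime import date, timedelta

cohort_install_day = date(2026, 1, 1)
# user -> set of days on which that user had a session (3 installers)
sessions = {
    "u1": {date(2026, 1, 1), date(2026, 1, 2), date(2026, 1, 8)},
    "u2": {date(2026, 1, 1)},
    "u3": {date(2026, 1, 1), date(2026, 1, 2)},
}

def retention_rate(day_n: int) -> float:
    target = cohort_install_day + timedelta(days=day_n)
    returned = sum(1 for days in sessions.values() if target in days)
    return returned / len(sessions) * 100

print(f"D1 {retention_rate(1):.0f}%  D7 {retention_rate(7):.0f}%")
# D1 67%  D7 33%
```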

Cohort Metrics

Cohort metrics group users by their install date and track cumulative behavior over time. Unlike standard retention (which measures if users returned on a specific day), cohort metrics aggregate totals within time windows, giving you a longer-term view of user value development.

| Metric | API Field | Description |
| --- | --- | --- |
| Cohort Size | cohort_size | Number of users in the install cohort |
| Cohort Revenue Day 0 | cohort_revenue_d0 | Total revenue from the cohort on install day |
| Cohort Revenue Day 7 | cohort_revenue_d7 | Cumulative revenue from the cohort within 7 days of install |
| Cohort Revenue Day 14 | cohort_revenue_d14 | Cumulative revenue from the cohort within 14 days of install |
| Cohort Revenue Day 30 | cohort_revenue_d30 | Cumulative revenue from the cohort within 30 days of install |
| Cohort Events Day 7 | cohort_events_d7 | Total tracked events within 7 days of install for the cohort |
| Cohort Sessions Day 7 | cohort_sessions_d7 | Total sessions within 7 days of install for the cohort |
| Cohort ROAS Day 7 | cohort_roas_d7 | Return on ad spend calculated from 7-day cohort revenue |
| Cohort ROAS Day 30 | cohort_roas_d30 | Return on ad spend calculated from 30-day cohort revenue |
| Cohort LTV | cohort_ltv | Cumulative lifetime value per user in the cohort at the current date |

Cohort analysis is particularly powerful for comparing acquisition channels on equal footing. A campaign with a higher CPI but better Day 30 cohort revenue may ultimately deliver superior ROI compared to a cheaper campaign that acquires low-value users. Always evaluate campaigns using cohort ROAS at 7, 14, and 30 days rather than relying solely on install volume.
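
As a worked example of that point (with invented numbers), a campaign with a 2.5x higher CPI can still win on Day 30 cohort ROAS:

```python
# The cheaper campaign loses on 30-day cohort ROAS despite the lower CPI.
campaigns = {
    "cheap":   {"installs": 10_000, "cpi": 1.00, "cohort_revenue_d30": 6_000},
    "premium": {"installs":  4_000, "cpi": 2.50, "cohort_revenue_d30": 14_000},
}

for name, c in campaigns.items():
    cost = c["installs"] * c["cpi"]
    roas_d30 = c["cohort_revenue_d30"] / cost  # cohort revenue / spend
    print(f"{name}: spend ${cost:,.0f}, D30 ROAS {roas_d30:.2f}")
# cheap: spend $10,000, D30 ROAS 0.60
# premium: spend $10,000, D30 ROAS 1.40
```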

SKAdNetwork Dimensions

Apple's SKAdNetwork (SKAN) provides privacy-preserving install attribution for iOS campaigns. Adjust decodes SKAN postbacks into meaningful dimensions and metrics. These fields are essential for iOS campaign measurement in a post-ATT world where device-level attribution is limited.

| Dimension | API Field | Description |
| --- | --- | --- |
| SKAN Campaign ID | skan_campaign_id | Apple SKAdNetwork campaign identifier (two-digit integer in SKAN 3 and earlier; SKAN 4.0 replaces it with a source identifier of up to four digits) |
| Conversion Value | skan_conversion_value | 6-bit value (0-63) encoding post-install activity within the measurement window |
| Coarse Conversion Value | skan_coarse_value | SKAN 4.0 simplified value: low, medium, or high |
| Source App ID | skan_source_app_id | App Store ID of the app where the ad was displayed |
| Source Domain | skan_source_domain | SKAN 4.0 web attribution: domain where the ad was displayed |
| Fidelity Type | skan_fidelity_type | Attribution fidelity: view-through (0) or StoreKit-rendered (1) |
| Postback Sequence Index | skan_postback_sequence | SKAN 4.0 postback window: 0 (0-2 days), 1 (3-7 days), 2 (8-35 days) |
| SKAN Version | skan_version | SKAdNetwork protocol version used for the postback (2.0, 3.0, 4.0) |
| Decoded Revenue | skan_decoded_revenue | Revenue value decoded from the conversion value based on the configured schema |
| Decoded Event | skan_decoded_event | In-app event decoded from the conversion value mapping |
| Null Conversion Value | skan_null_conversion | Whether the postback had a null conversion value (privacy threshold not met) |
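
To illustrate how a 6-bit conversion value unpacks, here is a sketch using an entirely hypothetical schema in which the low three bits encode a revenue bucket and the high three bits are event flags. Real schemas are whatever you configure in Adjust's conversion value mapping:

```python
# Decoding a SKAN fine conversion value (0-63) under a HYPOTHETICAL schema:
# bits 0-2 = revenue bucket, bits 3-5 = event flags. This mapping is made up
# for illustration; your own Adjust conversion value schema will differ.
REVENUE_BUCKETS = ["$0", "$0-1", "$1-5", "$5-10", "$10-25", "$25-50",
                   "$50-100", "$100+"]
EVENT_FLAGS = ["registration", "tutorial_complete", "first_purchase"]

def decode(conversion_value: int) -> tuple[str, list[str]]:
    bucket = REVENUE_BUCKETS[conversion_value & 0b111]        # bits 0-2
    flags = [name for i, name in enumerate(EVENT_FLAGS)
             if (conversion_value >> (3 + i)) & 1]            # bits 3-5
    return bucket, flags

print(decode(42))  # 42 = 0b101010 -> ('$1-5', ['registration', 'first_purchase'])
```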

Fraud Prevention Metrics

Adjust's Fraud Prevention Suite detects and rejects invalid traffic in real time. These metrics quantify the volume and type of fraudulent activity across your campaigns, helping you protect ad spend and maintain clean attribution data.

| Metric | API Field | Description |
| --- | --- | --- |
| Rejected Installs | rejected_installs | Total installs rejected by the fraud prevention suite |
| Rejected Reattributions | rejected_reattributions | Reattributions flagged and rejected as fraudulent |
| Click Injection | fraud_click_injection | Installs where a fraudulent click was injected between download and first open |
| SDK Spoofing | fraud_sdk_spoofing | Fake installs or events created by simulating SDK calls without a real device |
| Fake Installs | fraud_fake_installs | Installs from device farms or emulators detected through behavioral analysis |
| Anonymous IP | fraud_anonymous_ip | Installs originating from VPNs, proxies, or Tor exit nodes |
| Distribution Outlier | fraud_distribution_outlier | Installs with click-to-install time patterns that deviate from natural distribution |
| Too Many Engagements | fraud_too_many_engagements | Users with an abnormally high number of clicks (click flooding) |
| Untrusted Devices | fraud_untrusted_devices | Installs from devices with suspicious characteristics (rooted, jailbroken, emulators) |
| Rejection Reason | rejection_reason | Dimension indicating why the install was rejected (click_injection, sdk_spoofing, etc.) |
| Fraud Detection Rate | fraud_rate | Percentage of total installs flagged as fraudulent |

Interpreting fraud metrics: A healthy fraud rate is below 10% across your portfolio. Rates above 20% for specific networks or campaigns indicate serious issues that require investigation. Click injection is the most common mobile ad fraud type — it doesn't generate fake installs but steals attribution from organic or other paid sources. SDK spoofing is more damaging because it creates entirely fake installs that never represent real users.

Cost Metrics

Cost metrics connect your acquisition spending to performance outcomes, enabling ROI analysis and budget allocation decisions. Adjust ingests cost data from integrated ad networks and combines it with attribution data to calculate efficiency metrics.

| Metric | API Field | Description | Formula / Notes |
| --- | --- | --- | --- |
| Cost | cost | Total ad spend reported by the ad network | Ingested via cost API integrations with each network |
| Cost per Install (CPI) | cpi | Average cost to acquire one install | Cost ÷ Installs |
| Effective CPI (eCPI) | ecpi | Effective cost per install including all campaign costs | Total Cost ÷ Total Installs (across all cost models) |
| Cost per Click (CPC) | cost_per_click | Average cost per click on tracker links | Cost ÷ Clicks |
| Cost per Mille (CPM) | cost_per_mille | Cost per 1,000 impressions | (Cost ÷ Impressions) × 1,000 |
| Cost per Event | cost_per_event | Average cost per tracked in-app event | Cost ÷ Events (filter by event_token for specific event types) |
| ROAS | roas | Return on ad spend | Revenue ÷ Cost |
| ROAS (Ad Revenue) | roas_ad_revenue | Return on ad spend from in-app ad monetization revenue | Ad Revenue ÷ Cost |
| ROAS (Total) | roas_total | Combined ROAS including IAP, subscription, and ad revenue | (Revenue + Ad Revenue) ÷ Cost |
| Payback Period | payback_period | Number of days until cumulative revenue exceeds acquisition cost | Day at which cumulative cohort revenue per install ≥ CPI |
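
Payback period is simplest to compute from a cumulative revenue-per-install curve, as in this sketch with invented daily figures:

```python
# Payback period per the table: the first cohort day on which cumulative
# revenue per install reaches the CPI. Daily figures are invented.
cpi = 2.00
cumulative_rpi = [0.30, 0.55, 0.80, 1.05, 1.30, 1.60, 1.85,
                  2.05, 2.20, 2.40]  # day 0 .. day 9

payback_day = next((day for day, rpi in enumerate(cumulative_rpi)
                    if rpi >= cpi), None)  # None if never paid back
print(f"Payback on day {payback_day}")  # Payback on day 7
```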

How to Use Adjust Metrics for Campaign Optimization

Having access to all these metrics is powerful, but knowing which ones to prioritize at each stage of optimization determines your effectiveness as a mobile marketer. Here's a practical framework for selecting the right Adjust metrics for different objectives.

For user acquisition scaling

Focus on CPI, install volume, and CVR to understand acquisition efficiency. Compare CPI across networks and campaigns, but never optimize for CPI alone — a low CPI with poor retention wastes budget. Always pair CPI analysis with Day 7 retention to ensure you're acquiring quality users, not just cheap installs.

For retention optimization

Track Day 1, Day 7, and Day 30 retention rates broken down by network and campaign. Identify which acquisition sources produce the stickiest users. If Day 1 retention is strong but Day 7 drops sharply, the app's onboarding may succeed but the core experience fails to retain. If Day 1 is already low, the targeting or creative messaging may be attracting the wrong users.

For revenue optimization

Use cohort ROAS at Day 7, Day 14, and Day 30 to compare the true revenue return from different campaigns. Combine with ARPPU and payback period to understand how quickly campaigns become profitable. Campaigns with a payback period under 14 days are typically safe to scale, while those exceeding 30 days require careful evaluation of long-term LTV projections.

For fraud monitoring

Review rejected installs and fraud rate by network weekly. Set up alerts for sudden spikes in fraud metrics. Pay special attention to click injection rates on Android campaigns and SDK spoofing across all platforms. If a network consistently shows fraud rates above 15%, escalate to the partner or consider pausing the source entirely.
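
A weekly check like this can be scripted in a few lines. The sketch below flags networks whose rejected-install share crosses a threshold; the input numbers are illustrative, and whether you compute the rate against attributed installs, rejected installs, or both is a convention you should fix up front:

```python
# Weekly fraud check: flag networks whose rejected-install share crosses a
# threshold. In practice the counts would come from an Adjust report broken
# down by network; here they are invented. Denominator choice (attributed +
# rejected) is an assumption, not an Adjust-defined formula.
FRAUD_ALERT_THRESHOLD = 0.15  # 15%, per the escalation guidance above

networks = {
    "network_a": {"installs": 12_000, "rejected_installs": 600},
    "network_b": {"installs": 3_000, "rejected_installs": 700},
}

for name, n in networks.items():
    total = n["installs"] + n["rejected_installs"]
    fraud_rate = n["rejected_installs"] / total
    if fraud_rate > FRAUD_ALERT_THRESHOLD:
        print(f"ALERT {name}: fraud rate {fraud_rate:.1%}")
# ALERT network_b: fraud rate 18.9%
```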

For iOS campaign measurement

With limited device-level data on iOS, rely on SKAN conversion values, decoded revenue, and coarse conversion values to understand campaign performance. Configure your conversion value schema carefully to encode the metrics that matter most — typically a combination of revenue ranges and key event completions. Monitor null conversion value rates to understand how much data you are losing to Apple's privacy thresholds.

Key Differences Between Adjust and Other MMPs

While Adjust shares many concepts with other mobile measurement platforms like AppsFlyer, Branch, and Singular, there are important differences in terminology and data modeling that affect how you work with the data.

Tracker hierarchy vs. campaign hierarchy

Adjust uses a unique tracker-based hierarchy (Network > Campaign > Adgroup > Creative) tied to tracker tokens, while other MMPs typically use a campaign-based hierarchy pulled directly from ad network APIs. This means Adjust's attribution structure is defined at the tracker link level rather than inherited from the ad network's campaign structure.

Session definition

Adjust starts a new session when the user opens the app after the session interval has elapsed (default 30 minutes of inactivity). Other analytics platforms may use a similar 30-minute threshold but measure inactivity differently, for example from the last tracked event rather than from when the app was backgrounded, so session counts rarely match exactly. When comparing session metrics across platforms, confirm each platform's session definition first.

Reattribution window

Adjust's default inactivity window for reattributions is 7 days — meaning a user must be inactive for 7 days before a new paid touchpoint can trigger a reattribution. This is configurable per app. Other MMPs may use different default windows, making direct reattribution comparisons unreliable unless windows are aligned.

Common Mistakes When Analyzing Adjust Data

Even experienced mobile marketers make these mistakes when working with Adjust metrics. Avoiding them will improve the accuracy of your analysis and optimization decisions.

1. Comparing CPI without retention context

A network delivering installs at $0.50 CPI looks better than one at $2.00 CPI — until you check retention. If the cheap installs have 10% Day 7 retention versus 40% for the expensive ones, the $2.00 source delivers far more value. Always pair cost metrics with retention and cohort revenue data before making budget allocation decisions.
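
One quick way to quantify the trap is cost per retained user: CPI divided by the retention rate. With the numbers above, the cheap source's 4x CPI advantage evaporates entirely, and cohort revenue comparisons usually tip the balance further toward the higher-quality source:

```python
# Retention-adjusted cost comparison using the figures from the paragraph.
sources = {
    "cheap":     {"cpi": 0.50, "d7_retention": 0.10},
    "expensive": {"cpi": 2.00, "d7_retention": 0.40},
}
for name, s in sources.items():
    cost_per_retained = s["cpi"] / s["d7_retention"]  # CPI / retention rate
    print(f"{name}: ${cost_per_retained:.2f} per D7-retained user")
# cheap: $5.00 per D7-retained user
# expensive: $5.00 per D7-retained user
```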

2. Ignoring fraud metrics

If you don't regularly review rejected installs and fraud rates by source, you're likely paying for fraudulent traffic without knowing it. Even with Adjust's automatic rejection, some fraud can evade detection. Cross-reference abnormally high install volumes or unusually low post-install event rates as additional fraud indicators.

3. Mixing attributed and organic data

Adjust tracks both attributed (paid) and organic installs. When analyzing campaign performance, always filter to attributed installs only. Including organic installs in network-level reports inflates apparent performance and distorts CPI and ROAS calculations.

4. Misinterpreting SKAdNetwork conversion values

SKAN conversion values are encoded — a value of 42 does not mean 42 purchases or $42 in revenue. The meaning depends entirely on your configured conversion value schema. Always decode values using your schema mapping before drawing conclusions. Also account for null conversion values, which represent users who didn't meet Apple's privacy threshold and provide no post-install signal.

5. Using daily retention instead of rolling retention for LTV models

Daily retention (exact day return) understates ongoing engagement because many active users skip specific days. Rolling retention (active on Day N or later) gives a more accurate picture of how many users remain active over time. For LTV projections, rolling retention curves produce more reliable forecasts than exact-day retention rates.
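
The difference is easy to see in code. On the same toy data, rolling Day 7 retention counts anyone active on day 7 or later, so it is always at least as high as exact-day Day 7 retention:

```python
# Daily vs. rolling retention on the same toy data.
active_days = {   # days since install on which each user had a session
    "u1": {1, 2, 9},
    "u2": {1, 7},
    "u3": {3},
    "u4": set(),
}
n = len(active_days)

daily_d7 = sum(7 in days for days in active_days.values()) / n
rolling_d7 = sum(any(d >= 7 for d in days)
                 for days in active_days.values()) / n

print(f"exact-day D7 {daily_d7:.0%}, rolling D7 {rolling_d7:.0%}")
# exact-day D7 25%, rolling D7 50%
```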

6. Not accounting for attribution windows

Adjust's default click attribution window is 7 days and impression window is 24 hours. If you change these windows, historical comparisons become invalid. A campaign measured with a 14-day click window will naturally show more installs than the same campaign measured with a 7-day window. Document your attribution settings and keep them consistent when comparing time periods.