You've set up your experiment, added a beautiful audience condition, and launched. Three days in, your sample size is a fraction of what you projected. Or worse — you're seeing results that don't make sense because the wrong people are in your test.

Audience targeting in Optimizely looks simple on the surface and gets surprisingly tricky in practice. This guide covers everything the official docs gloss over.

Audience Targeting vs URL Targeting: The Confusion That Kills Experiments

This is the single most common setup mistake I see. People conflate these two concepts constantly, and they operate on entirely different axes:

  • Audience targeting controls WHO sees the experiment — based on visitor attributes like device type, geography, cookie values, referral source, query parameters, or custom attributes.
  • URL targeting controls WHERE the experiment runs — which pages trigger the experiment code.

You need both. And they interact multiplicatively, not additively. A visitor must satisfy BOTH the URL targeting AND the audience conditions to enter your experiment. Miss this and you'll either include visitors you didn't intend to, or exclude nearly everyone.

Here's the practical distinction: if you're building a test for "logged-in users," that's an audience condition (checking a cookie or custom attribute). If you're building a test for "the checkout page," that's URL targeting. If you want "logged-in users on the checkout page," you need both set correctly.

**Pro Tip:** Use URL targeting for page-level scoping and audience conditions for visitor-level scoping. Never try to do both jobs with one mechanism — you'll create maintenance nightmares and targeting bugs.
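The multiplicative interaction can be sketched as two independent checks that must both pass. This is an illustrative model, not Optimizely's actual engine — the `/checkout` pattern and `logged_in` cookie name are assumptions for the example:

```javascript
// Sketch: URL targeting (WHERE) and audience targeting (WHO) are separate
// checks, and a visitor must pass BOTH to enter the experiment.
function matchesUrl(pageUrl, targetPattern) {
  // "URL contains" is one of several match types; others include exact and regex
  return pageUrl.includes(targetPattern);
}

function matchesAudience(visitor) {
  // Visitor-level check, e.g. a logged-in cookie (name is illustrative)
  return visitor.cookies["logged_in"] === "true";
}

function entersExperiment(pageUrl, visitor) {
  // Multiplicative: URL targeting AND audience conditions
  return matchesUrl(pageUrl, "/checkout") && matchesAudience(visitor);
}
```

A logged-in visitor on the homepage fails the URL check; an anonymous visitor on checkout fails the audience check. Only the intersection enters.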

AND/OR Logic: Real Examples That Actually Make Sense

Optimizely lets you combine multiple audience conditions using AND and OR logic. Getting this wrong is how you accidentally include the wrong segment or exclude nearly everyone.

OR logic = any condition is sufficient to qualify. Use this when you want to broaden your audience.

Example: You want to target visitors who came from either a paid search campaign OR a social media campaign.

  • Condition 1: Query parameter utm_source = google
  • OR
  • Condition 2: Query parameter utm_source = facebook

Result: Any visitor matching either condition enters the experiment. Your audience is the union of both groups.

AND logic = all conditions must be true simultaneously. Use this when you need to narrow to a specific segment.

Example: You want to target mobile users who are also loyalty members.

  • Condition 1: Device = mobile
  • AND
  • Condition 2: Cookie "loyaltytier" exists

Result: Only visitors who are BOTH on mobile AND have the loyalty cookie enter. Your audience is the intersection — which is always smaller than either condition alone.

The dangerous pattern: mixing AND and OR without thinking through the logic. Optimizely evaluates conditions in groups. Within a single audience, conditions are ANDed. Between multiple audiences added to the same experiment, Optimizely ORs them. This trips people up constantly.

**Pro Tip:** When you add multiple audiences to an experiment, Optimizely combines them with OR — visitors qualifying for ANY of the audiences are included. If you want AND logic across different attribute types, put all conditions inside a single audience definition.
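The evaluation model described above — conditions ANDed inside one audience, multiple audiences ORed together — can be sketched in a few lines. This is an illustrative model, not Optimizely's internals; the condition names are made up for the example:

```javascript
// AND within an audience: every condition in the list must pass.
function audienceMatches(conditions, visitor) {
  return conditions.every((cond) => cond(visitor));
}

// OR across audiences: qualifying for ANY attached audience is enough.
function visitorQualifies(audiences, visitor) {
  return audiences.some((conditions) => audienceMatches(conditions, visitor));
}

// Example conditions (illustrative)
const isMobile = (v) => v.device === "mobile";
const hasLoyaltyCookie = (v) => "loyaltytier" in v.cookies;
const fromFacebook = (v) => v.utmSource === "facebook";

// Audience A: mobile AND loyalty member. Audience B: came from Facebook.
const audiences = [[isMobile, hasLoyaltyCookie], [fromFacebook]];
```

A desktop Facebook visitor qualifies via Audience B even though every Audience A condition fails — which is exactly the surprise that catches people when they expect AND behavior across audiences.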

Cookie-Based Targeting: Step-by-Step Setup

Cookie targeting is one of the most powerful and most misunderstood features. Here's how to do it correctly.

Setting Up a Cookie Condition

  1. Navigate to your Optimizely project, go to Audiences
  2. Click Create New Audience
  3. Add a condition of type Cookie
  4. Enter the cookie name exactly (case-sensitive)
  5. Choose the match type: exists, equals, contains, matches (regex)
  6. Enter the value if applicable

The Case Sensitivity Trap

Cookie names and values are case-sensitive in Optimizely's evaluation. usertype=premium is not the same as usertype=Premium. I've seen this waste entire weeks of test data. Always check your actual cookie values in the browser DevTools before building the audience.

First-Party Cookie Example

Say you set a cookie abeligible=true for users who've completed onboarding. Your audience condition:

  • Cookie name: abeligible
  • Match type: equals
  • Value: true

This will only fire for users where that exact cookie name and value are present.
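The four match types behave roughly like this sketch — an illustration of the semantics, not Optimizely's implementation. Note the strict, case-sensitive comparisons on both name and value:

```javascript
// Sketch of cookie condition evaluation: name lookup and value comparison
// are both case-sensitive, which is the trap described above.
function cookieCondition(cookies, name, matchType, value) {
  if (!(name in cookies)) return false;          // cookie name: case-sensitive
  const actual = cookies[name];
  switch (matchType) {
    case "exists":   return true;                 // presence alone qualifies
    case "equals":   return actual === value;     // exact, case-sensitive
    case "contains": return actual.includes(value);
    case "matches":  return new RegExp(value).test(actual);
    default:         return false;
  }
}
```

So `cookieCondition({ usertype: "Premium" }, "usertype", "equals", "premium")` is false — the mismatch that wastes weeks of test data if you don't check DevTools first.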

**Pro Tip:** Use browser DevTools (Application > Cookies) to verify the exact cookie name, value, and domain scope before building your audience condition. Mismatched domain scope (e.g., cookie set on subdomain but experiment running on root domain) is a frequent culprit for zero audience matches.

Dynamic Evaluation: The Timing Trap Most Teams Miss

Here's something the Optimizely docs mention briefly but don't explain well enough: audience conditions are evaluated every time the experiment checks for activation, not just once at session start.

This has real consequences:

The "time of day" example: If your audience includes a condition like "hour of day between 9am and 5pm," a visitor who arrives at 4:55pm and lingers on the page past 5:00pm could theoretically change their qualification status mid-session. In practice, once a visitor is bucketed into a variation, they stay there — but the initial evaluation timing matters for when they get bucketed.

The custom attribute timing problem: If you're using optimizely.push({type: 'user', attributes: {...}}) to set custom attributes, the order of operations matters. The attributes must be set BEFORE the experiment activation code runs, otherwise the audience check fires with incomplete data and the visitor gets excluded.

The correct order:

  1. Set user attributes
  2. Activate experiment / let Optimizely evaluate

Reverse this and you'll have an audience that looks empty even when it shouldn't be.

**Pro Tip:** If you're using Optimizely's snippet-based Web Experimentation and relying on custom attributes from your data layer, add the attribute push in the `<head>` before the Optimizely snippet. Attributes set after the snippet loads will miss the initial evaluation window.
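The ordering requirement can be demonstrated with a toy queue processor. This is a mock standing in for the real snippet's consumption of `window.optimizely` pushes — `processQueue` and the `loyaltyTier` attribute are invented for the illustration:

```javascript
// Mock of the push-queue ordering rule: the audience check at activation
// only sees attributes that were queued BEFORE it.
function processQueue(queue) {
  const attrs = {};
  const decisions = [];
  for (const entry of queue) {
    if (entry.type === "user") {
      Object.assign(attrs, entry.attributes);     // attributes become visible here
    } else if (entry.type === "activate") {
      // Audience check runs with whatever attributes exist *right now*
      decisions.push(attrs.loyaltyTier === "gold");
    }
  }
  return decisions;
}

// Correct order: attributes first, then activation → visitor qualifies
const correct = [
  { type: "user", attributes: { loyaltyTier: "gold" } },
  { type: "activate" },
];

// Reversed order: activation fires before attributes exist → visitor excluded
const reversed = [
  { type: "activate" },
  { type: "user", attributes: { loyaltyTier: "gold" } },
];
```

Same data, opposite outcome — purely because of ordering. That's why an audience can "look empty" while the attributes are demonstrably being set.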

Why Your Audience Is Always Smaller Than You Think: The Funnel Math

This is the conversation I have with every stakeholder when they see the estimated audience size and say "that can't be right."

Here's the math on a typical e-commerce site with 100,000 monthly visitors:

  • Start: 100,000 monthly visitors
  • After URL targeting (checkout page only): 18,000 (18% reach checkout)
  • After device targeting (desktop only): 9,000 (50% desktop)
  • After loyalty cookie condition: 2,700 (30% have loyalty status)
  • After traffic allocation (50% in experiment): 1,350 in experiment

1,350 visitors in a test you thought would have 50,000. This is why experiment duration estimates based on total site traffic are almost always wrong.

The practical implication: every condition you add shrinks your audience multiplicatively. If you need statistical significance in 4 weeks, you can only afford maybe two or three tightly scoped conditions. More than that and you're looking at months-long tests that are impractical to run.
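The funnel above is just repeated multiplication, which makes it easy to sanity-check before launch. The rates are the illustrative numbers from the example:

```javascript
// Each targeting layer multiplies the eligible audience down.
function eligibleAudience(monthlyVisitors, rates) {
  // rates: fraction of remaining visitors passing each successive condition
  return Math.round(rates.reduce((n, rate) => n * rate, monthlyVisitors));
}

const inExperiment = eligibleAudience(100000, [
  0.18, // reach checkout (URL targeting)
  0.50, // desktop only
  0.30, // have loyalty cookie
  0.50, // traffic allocation
]); // → 1350
```

Adding one more condition that "only" halves the audience takes this from 1,350 to 675 — and roughly doubles your test duration.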

**Pro Tip:** Before finalizing your audience conditions, go to Google Analytics (or your analytics tool) and run a segment matching all your conditions. The number of users in that segment over the last 30 days is your realistic eligible audience size. Divide by 2 for your test group. Plug that into a sample size calculator to get a realistic test duration.
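The duration arithmetic behind that Pro Tip looks like this. The sample size per variant comes from a standard calculator; the function and its inputs here are illustrative:

```javascript
// Rough test-duration estimate from a 30-day eligible audience count.
function testDurationWeeks(eligiblePer30Days, requiredSamplePerVariant, variants = 2) {
  const perVariantPer30Days = eligiblePer30Days / variants; // even split assumed
  const monthsNeeded = requiredSamplePerVariant / perVariantPer30Days;
  return monthsNeeded * (30 / 7); // convert 30-day periods to weeks
}

// 2,700 eligible visitors/month, calculator says 5,000 per variant:
const weeks = testDurationWeeks(2700, 5000); // ≈ 16 weeks — likely impractical
```

Sixteen weeks for what felt like a reasonable test is exactly the kind of number you want to see before launch, not after.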

Geographic and Device Targeting Setup

Device Targeting

In Optimizely Web Experimentation, device targeting is available under the Device attribute in audience conditions. Options typically include:

  • Desktop
  • Tablet
  • Mobile

One gotcha: Optimizely uses the user agent string to determine device type. If your site uses responsive design and you care about actual viewport behavior, consider whether "device type" is really what you want, or whether you'd be better served by a custom attribute based on viewport width set in your JavaScript.
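If viewport behavior is what you actually care about, a custom attribute derived from viewport width is one way to do it. The bucket names and the 768px/1024px thresholds below are assumptions, not Optimizely defaults:

```javascript
// Derive a viewport bucket to use as a custom attribute instead of
// user-agent device type. Thresholds are illustrative.
function viewportBucket(widthPx) {
  if (widthPx < 768) return "narrow";
  if (widthPx < 1024) return "medium";
  return "wide";
}

// In the page, push it before the snippet evaluates audiences
// (attribute name "viewportBucket" is an assumption, not a built-in):
// window.optimizely = window.optimizely || [];
// window.optimizely.push({
//   type: "user",
//   attributes: { viewportBucket: viewportBucket(window.innerWidth) },
// });
```

Per the timing section above, this push has to happen before the Optimizely snippet runs, or the initial audience evaluation won't see it.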

Geographic Targeting

Geographic conditions use IP-based geolocation. Accuracy is typically high for country-level targeting (95%+) and degrades as you go more granular — city-level accuracy is more like 80-85%. For US state-level targeting, it's reasonably reliable.

Setup: Audience > Add Condition > Location. You can target by country, region/state, city, or DMA.

Important limitation: IP geolocation misclassifies VPN users, corporate proxies, and mobile users on carrier networks. If you're running a geo-targeted test for a legal or regulatory reason (not just optimization), don't rely on IP geolocation alone.

**Pro Tip:** For high-stakes geo experiments (different pricing, different offers by region), supplement Optimizely's IP-based geolocation with a user-provided location signal — like a ZIP code from their account profile — stored as a custom attribute. IP geolocation alone has enough error to contaminate results in closely contested tests.

Troubleshooting When Audience Conditions Aren't Working

These are the actual root causes I've encountered across 7+ years:

Problem: Zero visitors entering the experiment

  • Check URL targeting first — is the experiment even firing on the right pages?
  • Open DevTools > Network, filter for optimizely requests, check the decision call
  • Use the Optimizely extension or browser console: window.optimizely.get('state').getExperimentStates()

Problem: Audience size is much lower than expected

  • Check condition logic (AND vs OR)
  • Verify cookie names and values match exactly, including case
  • Check if attributes are being set before experiment activation
  • Verify the audience is actually attached to the experiment (not just created)

Problem: Wrong visitors are being included

  • You may have multiple audiences ORed together unintentionally
  • Check for a default audience that catches all visitors

Problem: Results look contaminated

  • Someone may have changed the audience conditions after the experiment started — any change mid-test invalidates the data
  • Check the experiment change log in Optimizely

**Pro Tip:** Optimizely's Chrome extension ("Optimizely Web Helper") will tell you exactly why a visitor did or didn't qualify for an experiment, showing which conditions passed or failed. Install it. Use it every time you're debugging targeting before pulling data.

Common Mistakes

Using URL targeting to approximate an audience. Targeting /account/ doesn't mean "logged-in users." A user could be on the account pages while not logged in. Build a proper cookie or custom attribute condition.

Setting audience conditions after the fact. Changing audience conditions on a live experiment means your historical data was collected under different targeting rules. The data is now split in a way you can't analyze cleanly. Stop the experiment, create a new one.

Overloading a single test with too many audience conditions. More conditions = smaller audience = longer test duration. Most experiments should have 1-2 audience conditions max. If you have 5 conditions, you're building a personalization campaign, not an experiment.

Forgetting that "excluded" means excluded from results too. Visitors who don't meet audience conditions don't appear in your Results page at all. Your reported conversion rate is only for the targeted segment — which is usually different from your overall site CVR. Be careful when reporting upward.

Not testing the audience logic in a QA environment first. Always use Optimizely's preview mode or a QA cookie to manually verify you end up in the experiment before launching to real traffic.

What to Do Next

  1. Audit your live experiments — open each one and verify the audience conditions make logical sense with AND/OR, and that all conditions are actually needed. Remove any that aren't driving a real segmentation need.
  2. Build a reusable audience library — in Optimizely, create saved audiences for your most common segments (mobile users, logged-in users, returning visitors, loyalty members). Reuse them across experiments instead of rebuilding each time.
  3. Run the funnel math before every new experiment — use GA or your analytics tool to calculate the realistic eligible audience size before you write a single line of test code.
  4. Install the Optimizely Web Helper Chrome extension and use it to verify targeting on your next experiment before it goes live.
Written by Atticus Li

Revenue & experimentation leader — behavioral economics, CRO, and AI. CXL & Mindworx certified. $30M+ in verified impact.