Atticus Li has led marketing analytics at both Silicon Valley Bank (SVB) and NRG Energy, working across Google Analytics (Universal and GA4) and Adobe Analytics to drive measurable business outcomes. At SVB, he built attribution pipelines for a $1B+ startup banking operation. At NRG, he runs 100+ experiments per year across five retail energy brands. This article covers what enterprise analytics actually looks like when the dashboards close and the real work begins.

Two Companies, Two Completely Different Data Worlds

When I moved from SVB to NRG, one of the biggest adjustments wasn't the industry or the products — it was the analytics stack.

At SVB, we used Google Analytics. I was there through the Universal Analytics era and into the GA4 migration. At NRG, the entire measurement layer runs on Adobe Analytics, with Tealium as the CDP and Contentsquare for behavioral analytics.

These aren't just different tools. They're different philosophies.

Google Analytics gives you a standardized data model. Sessions, users, pageviews — they mean roughly the same thing across every GA implementation. The trade-off is flexibility. GA makes assumptions about how you want to measure things, and fighting those assumptions is painful.

Adobe Analytics gives you a blank canvas. You define your own variables, events, and data model. The trade-off is complexity. Two Adobe Analytics implementations can look completely different because every company builds its own measurement schema.

Neither is better. But switching between them taught me something critical: the tool doesn't define your analytics capability. Your understanding of what the data means does.

The Data Dictionary Problem

Here's something nobody tells you about enterprise analytics: the hardest part isn't building dashboards or running queries. It's understanding what the data actually represents.

At every company I've worked at, the definitions aren't what the names imply.

A "user" in Google Analytics isn't necessarily a person — it's a device/browser combination. A "session" in Adobe Analytics might be defined differently than what Google calls a session, depending on how session timeout rules are configured. A "conversion" might mean one thing in the marketing team's dashboard and something completely different in the finance team's reporting.

At SVB, one of the first things I did was build a data dictionary that documented every metric, dimension, and event — not just what the tool called them, but what they actually measured in the context of SVB's business. Who was counted as a "new user"? What triggered a "lead" event? How was attribution assigned across marketing channels?

At NRG, the same challenge exists at larger scale. With five brands, each running its own Adobe Analytics implementation, a "page view" on one brand's site might not be counted the same way as a "page view" on another brand's site. Before any cross-brand analysis, I have to reconcile those differences.
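To make this concrete, here's a minimal sketch of what a data dictionary entry can look like. The fields, brand names, and rules below are hypothetical rather than SVB's or NRG's actual schema; the point is the shape: each metric maps to a tool-level field, a business definition, an owner, and the caveats that bite you in cross-brand analysis.

```python
# Hypothetical data dictionary entry -- brands, fields, and rules are illustrative,
# not a real SVB or NRG schema.
from dataclasses import dataclass, field
from typing import List

@dataclass
class MetricDefinition:
    name: str                   # what the dashboard calls it
    tool_field: str             # the underlying variable or event in the analytics tool
    definition: str             # what it actually measures, in business terms
    owner: str                  # team accountable for the definition
    caveats: List[str] = field(default_factory=list)

page_view = MetricDefinition(
    name="Page View",
    tool_field="Adobe Analytics page_view event",
    definition="Fires on full page load; single-page-app route changes are not counted",
    owner="Web Analytics",
    caveats=["Brand A counts SPA route changes as page views; Brand B does not"],
)
print(page_view.definition)
```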

This is unglamorous, tedious work. But it's the foundation of everything. Every insight, every test result, every revenue projection is only as good as your understanding of what the underlying data actually means.

Data Storytelling: The Skill Nobody Teaches

At both SVB and NRG, many of the stakeholders I present to are encountering this kind of data for the first time. Marketing VPs, brand managers, and C-suite executives often didn't grow up in data. They're brilliant at their jobs, but they don't think in conversion rates and confidence intervals.

This is where most analysts fail. They build a 40-slide deck packed with every metric they can find, overwhelm the room, and then wonder why nobody acts on the findings.

I've learned to do the opposite. My reporting follows three rules:

1. Surface what matters. If I have 50 data points, I'm showing three. The three that tell the story of what's happening, why it's happening, and what we should do about it. The other 47 live in an appendix for anyone who wants to dig deeper.

2. Always give recommendations. Data without recommendations is just trivia. Every analysis I present ends with "here's what I think we should do and why." Stakeholders don't want to be data analysts — they want to make decisions. My job is to make the decision easier, not harder.

3. You're a consultant, not just an analyst. This mindset shift changed my career. An analyst says "here's what the data shows." A consultant says "here's what the data shows, here's what it means for the business, here's what I recommend, and here's the risk if we do nothing." That second version is what gets you a seat at the strategy table instead of the reporting table.

SVB: What I Built

At Silicon Valley Bank, I was part of the marketing analytics team supporting what became a $1B+ startup banking pipeline. The work spanned several areas, and each one taught me something different about what enterprise analytics demands.

Campaign Landing Pages and Partner Experiments

SVB ran a significant volume of partner landing pages — co-branded pages with accelerators, venture firms, and tech ecosystem partners. I ran A/B tests on these landing pages, optimizing for lead form submissions and qualification rates.

The results were strong: 32% average conversion uplift across the landing page experiments. But the number alone isn't the interesting part. What made these tests work was understanding that SVB's audience — startup founders and CFOs — had fundamentally different page behavior than typical B2C users. They read more, scrolled deeper, and cared more about credibility signals (logos, testimonials, specific program details) than generic value propositions.
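For anyone who wants the mechanics, a relative uplift number only means something alongside a significance check. Here's a minimal sketch of reading a landing-page test with a two-proportion z-test; the visitor and conversion counts are invented for illustration, not actual SVB results.

```python
# Hedged sketch: two-proportion z-test on a hypothetical landing-page A/B test.
# All counts below are illustrative, not real SVB data.
from math import sqrt
from statistics import NormalDist

control_visitors, control_conversions = 12_000, 480   # 4.0% baseline conversion rate
variant_visitors, variant_conversions = 12_000, 634   # ~5.3% variant conversion rate

p_c = control_conversions / control_visitors
p_v = variant_conversions / variant_visitors
p_pool = (control_conversions + variant_conversions) / (control_visitors + variant_visitors)

# Standard error of the difference in proportions under the pooled rate
se = sqrt(p_pool * (1 - p_pool) * (1 / control_visitors + 1 / variant_visitors))
z = (p_v - p_c) / se
p_value = 2 * (1 - NormalDist().cdf(abs(z)))

relative_uplift = (p_v - p_c) / p_c
print(f"Relative uplift: {relative_uplift:.1%}, z = {z:.2f}, p = {p_value:.4f}")
```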

Email A/B testing was another significant channel. We tested subject lines, send times, content length, and CTA placement across SVB's email campaigns. The key learning: SVB's audience was so niche and high-intent that most generic email "best practices" (short subject lines, early-morning sends) didn't apply. We had to build our own playbook from scratch.

Customer Acquisition at Scale

The team contributed to 22% year-over-year growth in customer acquisition. My role was building the analytics infrastructure that let us attribute new customers back to specific marketing channels and campaigns.

This sounds straightforward until you realize that enterprise banking customers don't convert in a single session. A founder might see an SVB ad at a conference, visit the website three months later, get referred by their VC six months after that, and finally open an account a year into the relationship. Attributing that conversion to a single touchpoint is meaningless. We built multi-touch attribution models that gave fractional credit across the entire journey.
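As a sketch of the idea (not the production model we ran), here's one simple way fractional credit can be assigned: a time-decay rule where touchpoints closer to the conversion earn more weight. The journey and channel names are hypothetical.

```python
# Illustrative time-decay attribution: touchpoints closer to the conversion get more
# credit. A sketch of the general technique, not a production attribution model.
from datetime import date

def time_decay_credit(touchpoints, conversion_date, half_life_days=30):
    """Return fractional credit per channel, decaying with days before conversion."""
    weights = {}
    for channel, touch_date in touchpoints:
        days_before = (conversion_date - touch_date).days
        weights[channel] = weights.get(channel, 0.0) + 0.5 ** (days_before / half_life_days)
    total = sum(weights.values())
    return {channel: w / total for channel, w in weights.items()}

# Hypothetical founder journey: conference ad, organic visit, VC referral, paid search.
journey = [
    ("events",      date(2022, 1, 15)),
    ("organic",     date(2022, 4, 20)),
    ("referral",    date(2022, 10, 3)),
    ("paid_search", date(2023, 1, 10)),
]
print(time_decay_credit(journey, conversion_date=date(2023, 1, 18)))
```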

Geo-Incrementality: The OOH Attribution Problem

One of the projects I'm most proud of at SVB was our geo-incrementality testing for out-of-home (OOH) advertising.

SVB wanted to measure whether their billboard and physical advertising campaigns actually drove online conversions. The problem: OOH advertising has traditionally been unmeasurable. You put up a billboard in Austin and hope for the best.

We designed an incrementality experiment: Austin and Miami were test markets where OOH campaigns ran, and Seattle was the control market with no OOH advertising. By comparing web traffic, lead generation, and conversion patterns between test and control markets — while controlling for other marketing activities — we built what I believe was SVB's first OOH-to-online attribution pipeline.

The results showed measurable incremental lift in web traffic and lead generation in the OOH markets. More importantly, the methodology gave SVB a repeatable framework for measuring offline marketing's online impact — something most companies still can't do.
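The arithmetic behind incremental lift is a test-versus-control comparison. Here's a minimal difference-in-differences sketch with invented weekly lead counts; it ignores the covariate adjustments a real geo test needs, but it shows the core calculation.

```python
# Sketch of a difference-in-differences read for a geo-incrementality test.
# Weekly lead counts below are invented purely to show the arithmetic.
import statistics

pre_test  = [210, 198, 225, 204]   # weekly leads in test markets, before the OOH flight
post_test = [262, 251, 270, 259]   # weekly leads in test markets, during the flight
pre_ctrl  = [190, 202, 195, 188]   # control market, before
post_ctrl = [196, 205, 199, 193]   # control market, during

test_change = statistics.mean(post_test) - statistics.mean(pre_test)
ctrl_change = statistics.mean(post_ctrl) - statistics.mean(pre_ctrl)

# Change in the test markets not explained by the change seen in the control market
incremental_lift = test_change - ctrl_change
print(f"Incremental weekly leads attributable to OOH: {incremental_lift:.1f}")
```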

Consolidating the Reporting Mess

When I arrived at SVB, there were six separate marketing reports going to different stakeholders, each with different metrics, different time frames, and different definitions of success. Some contradicted each other.

I consolidated them into a single Looker dashboard that served as the single source of truth for marketing performance. This drove a 700% increase in web traffic visibility — not because traffic increased that much, but because for the first time, stakeholders could actually see the full picture instead of a fragmented one.

The consolidation process was politically delicate. Every report had an owner who was protective of "their" metrics. Replacing six reports with one meant convincing six people that a unified view was better than their custom slice. It took months of stakeholder management, but the end result was transformative: fewer meetings, faster decisions, and no more "your numbers don't match my numbers" debates.

NRG: A Different Scale, The Same Principles

At NRG, the analytics challenges are different in specifics but identical in nature. Instead of one brand with one analytics implementation, I'm working across NRG Energy's five retail brands — each with their own Adobe Analytics setup, their own data definitions, and their own stakeholder groups.

The experimentation program I've built here — 150+ experiments in total, more than 100 in 2025 alone — is fundamentally an analytics operation. Every test requires accurate measurement. Every result requires proper statistical analysis. Every projected revenue impact requires clean data flowing from Adobe Analytics through our reporting infrastructure.
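For context on how a projected revenue impact number gets built, here's a back-of-the-envelope sketch. Every input below is hypothetical, and a real projection needs holdout validation before anyone banks on it.

```python
# Hypothetical roll-up of a single test win into a projected annual revenue figure.
# All inputs are illustrative; real projections should be validated against holdouts.
annual_visitors      = 400_000   # visitors to the tested flow per year
baseline_conv_rate   = 0.035     # conversion rate before the winning variant
observed_uplift      = 0.08      # 8% relative lift measured in the test
revenue_per_customer = 450       # average first-year revenue per new customer

incremental_customers = annual_visitors * baseline_conv_rate * observed_uplift
projected_annual_impact = incremental_customers * revenue_per_customer
print(f"Projected annual impact: ${projected_annual_impact:,.0f}")
```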

The tools changed. Google Analytics became Adobe Analytics. Looker became internal reporting tools. The startup banking world became retail energy. But the core principles stayed the same:

Understand your data before you trust your data. Build the data dictionary. Validate the event tracking. Question every metric until you understand exactly what it measures and what it doesn't.

Tell stories, not spreadsheets. Your stakeholders don't care about your SQL skills. They care about what's happening in their business and what they should do about it. Package insights as narratives with clear recommendations.

Be a consultant. Don't wait to be asked for data. Proactively surface insights that matter. Build relationships with stakeholders so they come to you with questions before making decisions, not after.

Tie everything to money. At SVB, it was pipeline value and customer acquisition cost. At NRG, it's revenue per customer and projected annual lift. The metric changes, but the principle doesn't: if you can't connect your analytics work to revenue, you'll always be a cost center instead of a profit driver.

What Enterprise Analytics Actually Looks Like

I want to be honest about something: most of what I've described isn't exciting. Building data dictionaries, reconciling metric definitions across brands, sitting in meetings to explain what a confidence interval means to a VP — this isn't the sexy side of analytics.

But it's the real work. And it's the work that creates the foundation for everything else. The $1.2M+ in projected annual impact from NRG's 2025 test wins? That only happened because the measurement was solid. The PRISM framework? It only works if the underlying data is trustworthy.

If you're an analyst trying to level up, stop optimizing your SQL queries and start optimizing your stakeholder communication. Learn to tell stories with data. Learn to give recommendations, not just reports. Learn to be the person in the room who translates numbers into decisions.

That's what enterprise marketing analytics actually looks like. Not from the stage at a conference, but from the seat where the work gets done.

Want to discuss analytics strategy? Reach out at [email protected].

Atticus Li

Leads applied experimentation at NRG Energy. $30M+ in verified revenue impact through behavioral economics and CRO.