Before building Jobsolv's AI platform, Atticus Li validated the market by offering done-for-you resume services at $2,000-$3,000 per client, serving 26 clients with a 92.3% interview success rate within 30 days. This services-first approach generated real revenue, proved real demand, and built the playbook that the AI product now automates at scale.

Why I Didn't Build the Product First

Every founder I talk to wants to start with the product. They have a vision for the platform, they've sketched the UI, they've picked their tech stack. They want to build.

I get it. Building is more fun than selling. But building before validating is how most startups die — not because the product was bad, but because nobody wanted it enough to pay for it.

When I started Jobsolv, the hypothesis was simple: job seekers were wasting enormous amounts of time applying to jobs they weren't qualified for, with resumes that didn't match the role. An AI platform that automated tailored applications could solve both problems.

But that hypothesis had assumptions I needed to test. Would people pay for this? How much? Would the results actually be good enough to justify the price? What specific outcomes did customers care about — more interviews, better companies, faster offers?

The cheapest and fastest way to answer those questions wasn't to build software. It was to do the work by hand.

The Done-for-You Phase

I launched what I called the Jobsolv Signature Service. High-touch, completely manual, and priced accordingly: $2,000-$3,000 per client.

For that price, clients got the full service: a resume rewrite optimized for applicant tracking systems (ATS), cover letter templates, a targeted job search, and tailored applications submitted on their behalf. I did the work myself initially, then brought on help as demand grew.

The pricing was deliberate. I didn't want bargain hunters. I wanted people who were serious enough about their job search to invest real money, because those were the people who would give me honest feedback and who would represent the kind of customer the SaaS product would eventually need to satisfy.

Over the course of the services phase, I worked with 26 clients. The results validated the hypothesis decisively.

92.3% interview success rate within 30 days. Not 30 days from when they started applying — 30 days from when they signed up. That's the metric that mattered. Job seekers don't care about your ATS score or your resume formatting. They care about one thing: am I getting interviews?

Twenty-four out of 26 clients got interviews within a month. That's not a marginal improvement over the baseline. That's a fundamentally different outcome than most job seekers experience when applying on their own.

What I Learned From 26 Clients

The services phase wasn't just about validation. It was about learning things that I couldn't have learned from market research or customer surveys.

Learning 1: The resume isn't the bottleneck — targeting is. Most clients came to me thinking their resume was the problem. And their resumes usually did need work. But the bigger issue was that they were applying to the wrong jobs. They were spray-and-pray applying to hundreds of positions, most of which they weren't qualified for. When I targeted their applications to roles that matched their actual experience and skills, interview rates went up dramatically — sometimes before I even rewrote the resume.

This insight shaped the entire Jobsolv product architecture. The AI doesn't just fix your resume. It matches you to jobs where you're genuinely competitive and tailors each application to the specific role.
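
To make the targeting idea concrete, here's a minimal sketch of a skill-overlap score, assuming a simple set-intersection model. The job titles, skill sets, and 0.6 cutoff are hypothetical illustrations, not Jobsolv's actual matching logic.

```python
# Hypothetical skill-overlap targeting sketch, not Jobsolv's production model.
from dataclasses import dataclass


@dataclass
class Job:
    title: str
    required_skills: set[str]


def match_score(candidate_skills: set[str], job: Job) -> float:
    """Fraction of the job's required skills the candidate already has."""
    if not job.required_skills:
        return 0.0
    return len(candidate_skills & job.required_skills) / len(job.required_skills)


candidate = {"sql", "python", "a/b testing", "tableau"}
jobs = [
    Job("Data Analyst", {"sql", "tableau", "excel"}),
    Job("ML Engineer", {"python", "pytorch", "docker", "kubernetes"}),
]

# Only surface roles where the candidate is genuinely competitive.
for job in sorted(jobs, key=lambda j: match_score(candidate, j), reverse=True):
    score = match_score(candidate, job)
    if score >= 0.6:  # hypothetical competitiveness threshold
        print(f"{job.title}: {score:.0%} skill coverage")
```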

Learning 2: People will pay $2K-$3K for results. This wasn't obvious. The resume writing market is crowded with $200-$500 services. I was charging 5-10x the market rate. But at that price point, I attracted clients who valued outcomes over cost — and 26 of them said yes, generating $58K+ in services revenue before I wrote a single line of product code.

Learning 3: The process has repeatable patterns. After working with 26 clients manually, I could see exactly which steps were formulaic (ATS optimization, keyword matching, format compliance) and which required human judgment (career narrative positioning, industry-specific language). The formulaic steps were automatable. The judgment calls could be augmented by AI but might still need human review for premium tiers.

This decomposition — understanding which parts of the service were mechanical and which were creative — was the blueprint for the SaaS product's feature set.
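
As a rough sketch of what that blueprint looked like, here's the decomposition expressed as data. The step names come from the paragraphs above; the flags and structure are illustrative, not the production pipeline.

```python
# Illustrative decomposition of the manual service into pipeline steps.
# Step names mirror the article; the data structure itself is hypothetical.
from dataclasses import dataclass


@dataclass
class PipelineStep:
    name: str
    automatable: bool          # formulaic -> automate first
    needs_human_review: bool   # judgment call -> AI-assisted, reviewed in premium tiers


PIPELINE = [
    PipelineStep("ATS optimization", automatable=True, needs_human_review=False),
    PipelineStep("Keyword matching", automatable=True, needs_human_review=False),
    PipelineStep("Format compliance", automatable=True, needs_human_review=False),
    PipelineStep("Career narrative positioning", automatable=False, needs_human_review=True),
    PipelineStep("Industry-specific language", automatable=False, needs_human_review=True),
]

print("Automate first:", [s.name for s in PIPELINE if s.automatable])
print("AI-assisted with review:", [s.name for s in PIPELINE if s.needs_human_review])
```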

The Transition to Product

With 26 clients validating demand, a 92.3% success rate validating outcomes, and $58K+ validating willingness to pay, I had enough evidence to build.

Jobsolv launched as a SaaS platform in March 2024. The initial product automated the targeting and resume tailoring that I'd been doing by hand, using AI to match candidates to suitable roles and customize their applications.

The early traction confirmed that the services phase had identified real demand:

  • 304 users at launch — mostly from the services client network and their referrals
  • 8,233 users within six months — 111.5% month-over-month growth rate (see the sketch after this list for how a MoM rate is computed)
  • 30,000+ users to date — organic growth with effectively zero ad spend
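
For readers who want to see how a month-over-month figure is computed, here's a minimal sketch. All of the monthly counts below are hypothetical placeholders chosen to illustrate the arithmetic, not Jobsolv's actual trajectory.

```python
# Sketch: computing month-over-month (MoM) growth. All counts are
# hypothetical placeholders, not Jobsolv's actual monthly figures.
monthly_users = [100, 215, 450, 960, 2000]

# Per-month growth rates between consecutive months.
rates = [(curr - prev) / prev for prev, curr in zip(monthly_users, monthly_users[1:])]
for month, rate in enumerate(rates, start=1):
    print(f"Month {month}: {rate:+.1%} MoM")

# Compound (geometric-mean) MoM rate implied by the two endpoints.
periods = len(monthly_users) - 1
compound = (monthly_users[-1] / monthly_users[0]) ** (1 / periods) - 1
print(f"Compound MoM rate: {compound:.1%}")
```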

I've written in detail about how we grew to 30K users without paid advertising. The short version: product-led growth works when the product actually solves the problem, and the services phase proved it did before the product ever had to prove itself.

Building the Team

Scaling from a services operation to a SaaS platform required a team I didn't have. At peak, Jobsolv had a 27-member cross-functional team. Building and managing that team was its own education.

The development team was a 3-5 person agency based in Egypt. I chose an agency over individual contractors because I needed a team that could handle sprint planning, code reviews, and coordinated delivery without me managing each person individually. The agency model gave me a technical lead and a team structure without the overhead of recruiting and onboarding individual developers.

The design team was a separate UX/UI agency. Separating design from development was intentional — I wanted design decisions driven by user research and usability testing, not by what was easiest to implement. Having an independent design team created healthy tension between "what's the ideal experience" and "what's feasible to build this sprint."

Freelancers from Upwork and Fiverr handled specialized tasks: content writing, customer support, data labeling for AI training, and marketing collateral. The gig economy gets a lot of criticism, but for a bootstrapped startup, it's an incredibly efficient way to access specialized skills without permanent headcount commitments.

What I Learned About Hiring Remote Talent

Managing a distributed team across multiple agencies, time zones, and engagement models taught me things that no management book covers.

Evaluating portfolios is necessary but insufficient. A beautiful portfolio tells you someone can do good work under ideal conditions. It doesn't tell you whether they can meet deadlines, communicate proactively about blockers, or handle feedback without getting defensive. I learned to weight the trial project more heavily than the portfolio — a paid two-week engagement on a real deliverable reveals more than any number of case studies.

Fixed-price vs. hourly contracts depends on scope clarity. For well-defined deliverables (design a landing page, build this API endpoint), fixed-price contracts align incentives. For exploratory or evolving work (iterate on onboarding flow, debug performance issues), hourly contracts are more honest. I used both, depending on the task.

Milestone payments protect both sides. For larger projects, I structured payments around deliverable milestones rather than time periods. This gave the contractor clear targets and gave me natural checkpoints to evaluate progress and course-correct if needed.

Resolving performance issues quickly is a kindness, not a cruelty. Early in the Jobsolv build, I let underperformance persist too long because I didn't want to have difficult conversations. That's a mistake. The contractor knows they're struggling. The rest of the team knows. Addressing it directly — with specific examples, clear expectations, and a defined improvement timeline — is more respectful than letting it fester.

Product-Led Growth After Launch

The services-to-SaaS transition worked because the product delivered the same outcomes the services had, at a fraction of the cost and at much greater scale.

The numbers tell the story:

  • 30K+ users — organic growth driven by word of mouth, SEO, and product quality
  • $80K+ total revenue — combining services revenue and SaaS subscriptions
  • Under $0.50 CPA — because most acquisition is organic, the cost per acquisition is negligible compared to paid channels
  • 24%+ win rate in 2025 — candidates using Jobsolv are winning jobs at rates that validate the product's effectiveness

The low CPA is worth emphasizing. When I did experiment with paid acquisition early on, the unit economics didn't work for a bootstrapped company. I've detailed this in the 30K users post. The services phase revenue gave me runway to build the product, and the product's quality drove organic growth that didn't require an advertising budget.

The Validation Framework

If I were advising another founder on validating a SaaS idea, here's the framework I'd recommend based on the Jobsolv experience:

Step 1: Sell the service manually. Don't build anything. Offer to do the thing your product will eventually do, by hand, for individual clients. Charge a premium price — you're testing willingness to pay, and low prices attract low-signal customers.

Step 2: Track outcomes obsessively. The 92.3% interview rate wasn't something I calculated after the fact. I tracked every client's progress in real time, measured time to first interview, and recorded which approaches worked and which didn't. This data became the basis for product decisions.
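
As a sketch of what that tracking can look like in code, here's a minimal version of the time-to-first-interview metric. The record structure and dates are illustrative, not my actual client data.

```python
# Minimal sketch of outcome tracking; names and dates are illustrative.
from dataclasses import dataclass
from datetime import date


@dataclass
class ClientOutcome:
    name: str
    signed_up: date
    first_interview: date | None  # None = no interview yet


def interview_rate_within(clients: list[ClientOutcome], days: int = 30) -> float:
    """Share of clients whose first interview landed within `days` of signup."""
    hits = sum(
        1
        for c in clients
        if c.first_interview and (c.first_interview - c.signed_up).days <= days
    )
    return hits / len(clients)


clients = [
    ClientOutcome("A", date(2023, 5, 1), date(2023, 5, 18)),   # 17 days: counts
    ClientOutcome("B", date(2023, 5, 3), date(2023, 6, 20)),   # 48 days: too late
    ClientOutcome("C", date(2023, 5, 10), None),               # no interview yet
]
print(f"Interview rate within 30 days: {interview_rate_within(clients):.1%}")
```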

Step 3: Identify the automatable patterns. After serving enough clients manually, you'll see which steps are repeatable and mechanical (automate these first) and which require genuine expertise (these become your AI-augmented features or your premium tier).

Step 4: Build the minimum product that replicates the service outcomes. Not the minimum product you can ship — the minimum product that delivers the same results your manual service delivered. If the product can't match the service outcomes, it's not ready to launch.

Step 5: Use service clients as your launch base. Your first 26 clients (or however many you served) are your best marketing channel. They've seen the results. They'll refer others. And they'll give you honest feedback on whether the product is as good as the service was.

What I'd Do Differently

The services phase was unequivocally the right call. But I'd change a few things in execution.

I'd start with a higher volume of clients at a slightly lower price point — maybe $1,500 instead of $2,000-$3,000. Twenty-six clients gave me enough signal, but 50 would have given me more statistical confidence in the outcomes and more diverse use cases to design for.
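
One way to quantify that intuition: a 95% Wilson confidence interval around the observed rate narrows as the sample grows. The sketch below compares the actual 24/26 with a hypothetical 46/50, the same observed rate at roughly double the sample size.

```python
# Wilson 95% confidence interval for a binomial proportion, showing how
# the interval tightens with sample size at the same observed rate.
from math import sqrt


def wilson_interval(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    p = successes / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = z * sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return center - half, center + half


for successes, n in [(24, 26), (46, 50)]:  # 46/50 is a hypothetical comparison
    low, high = wilson_interval(successes, n)
    print(f"{successes}/{n}: 95% CI [{low:.1%}, {high:.1%}]")
```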

I'd formalize the feedback loop earlier. I collected client feedback informally, but I should have structured it into a systematic research process from the first client. The informal feedback was useful but biased toward whatever the most recent client mentioned.

I'd also move faster on the transition. The services phase ran longer than it needed to because I was reluctant to commit to building the product until I felt absolutely certain. In hindsight, I had enough validation after 15-20 clients. The last 6-10 were confirmation, not discovery.

The Bigger Picture

The Jobsolv journey — from manual services to validated SaaS — connects to everything I believe about building products and running experiments. At NRG, I test hypotheses about customer behavior through A/B tests. At Jobsolv, I tested a business hypothesis through a services-first model. The methodology is different, but the principle is the same: validate before you invest.

This is core to my PRISM Method. Whether you're testing a new enrollment flow or testing a new business model, the discipline is the same: define the hypothesis, design the measurement, execute the test, interpret the results honestly, and let the data guide the next decision.

Jobsolv exists because the services phase proved it should. And 30K+ users later, I'm glad I had the patience to validate before building — even when every instinct said to just start coding.

Building something and thinking about the services-to-SaaS path? Reach me at [email protected].
