> gtm-0-to-1-launch
Launch new products from idea to first customers. Use when launching products, finding early adopters, building launch week playbooks, diagnosing why adoption stalls, or learning that press coverage does not equal growth. Includes the three-layer diagnosis, the 2-week experiment cycle, and the launch that got 50K impressions and 12 signups.
0-to-1 Launch
Launch new products from idea to first customers. The goal isn't headlines — it's finding 10 customers who can't live without you.
When to Use
Triggers:
- "How do we launch this product?"
- "First customer acquisition strategy"
- "We launched but nobody's using it"
- "Product Hunt vs direct outreach?"
- "We have awareness but no conversion"
- "How do I know if this is working?"
Context:
- New product launches
- Feature launches that feel like new products
- Finding first 10-50 customers
- Validating product-market fit
- Diagnosing why early traction stalls
Core Frameworks
1. Press ≠ Growth (The Launch That Got 12 Signups)
The Pattern:
Coordinated a feature launch with full press tour. TechCrunch, VentureBeat, product blogs. Big announcement day.
Result:
- 50K impressions
- 12 signups
- 2 conversions
Why It Failed:
Optimized for media buzz, not user value. The feature wasn't ready for self-serve. It needed education, context, hand-holding. Press gives you eyeballs. But eyeballs without activation = vanity.
What Works Better:
Email 50 target customers directly. "We built [feature] because teams like yours struggle with [problem]. Want early access?" Walk them through setup personally. Get feedback, iterate.
Result: 50 emails → 15 replies (30% reply rate) → 8 trials → 4 conversions (50% trial-to-paid).
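The funnel above is easy to sanity-check. A minimal sketch in Python (the stage names and counts come from the example above; everything else is generic illustration):

```python
# Illustrative outreach-funnel math for the example above.
# Stage counts are the ones from the text; nothing else is from the source.
stages = [("emails", 50), ("replies", 15), ("trials", 8), ("paid", 4)]

# Stage-to-stage conversion rates
for (prev_name, prev_n), (name, n) in zip(stages, stages[1:]):
    rate = n / prev_n
    print(f"{prev_name} -> {name}: {n}/{prev_n} = {rate:.0%}")

# Overall conversion from first touch to paid
overall = stages[-1][1] / stages[0][1]
print(f"overall: {overall:.0%}")  # 4/50 = 8%
```

Eight percent email-to-paid is why 50 hand-picked accounts beat 50K anonymous impressions: the press launch converted at 2 of 50,000.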
The Lesson:
Early customers come from direct outreach, not press coverage. Press matters later (Series A announcement, major milestone). For 0-to-1, it's a distraction.
2. The Three-Layer Diagnosis (Why Launches Stall)
The Pattern:
You launched. You have some awareness. But conversion is weak. The problem lives in one of three layers, and each requires a different intervention.
Layer 1: Positioning Problem
Symptoms:
- Messaging sounds like competitors
- Differentiation requires explaining complex technical details
- Buyers see you as interchangeable with alternatives
- Sales conversations get derailed by comparison questions
Diagnosis: You're "fighting an asymmetric war on the wrong front" — competing on features against better-funded companies. Map where competitors claim unique value. Find the position they can't easily copy.
Fix: Stake a claim you can own structurally (not just through product features). Test with outbound messaging before committing product resources.
Layer 2: Experience Problem
Symptoms:
- Strong awareness but weak activation
- Users sign up but don't complete first workflow
- Multiple entry points creating decision paralysis
- Documentation is feature-centric, not outcome-centric
Diagnosis: Flexibility without opinionated defaults is a liability, not a feature. Users face the "paradox of choice" — too many options, not enough guidance to the aha moment.
Fix: Identify 2-3 "undeniable use cases" that deliver immediate value. Restrict onboarding to those specific use cases. Gate advanced features behind a mastery path. Rewrite help content around jobs-to-be-done, not feature lists.
Layer 3: Alignment Problem
Symptoms:
- Team reports being "out of bandwidth" for customers
- Different functions optimize for different metrics
- Every idea has equal weight (no tiebreaker)
- No clear north star connecting activities to outcomes
Diagnosis: "Exploratory mode" — where every initiative has equal priority — becomes destructive when resources are constrained.
Fix: Define a single shared north star. Use it as tiebreaker for every decision: "Does this help us win a customer?" Cut activities that don't ladder up. Make progress visible weekly, not quarterly.
How to Use This:
When a launch stalls, diagnose which layer is broken before throwing resources at it. Fixing experience when the problem is positioning wastes engineering time. Fixing positioning when the problem is internal alignment wastes marketing spend.
3. The First 10 Customers Framework
Principle: First 10 customers are not for revenue. They're for learning.
What You're Learning:
- Does the product actually solve the problem?
- What's the activation flow? (How do they get value?)
- What objections come up? (Price, features, integrations?)
- Who's the real buyer? (Title, role, budget authority?)
- What's the sales cycle? (Days, weeks, months?)
How to Find Them:
Channel 1: Personal Network (first 2-3)
- "I'm building [X], can I get your feedback?"
- Convert to paying customers (don't give away for free — free users give different feedback than paying ones)
Channel 2: Direct Outreach (customers 3-20)
- Build list of 100 target accounts
- Personalize to their specific pain
- Test messaging variants — which angle gets replies?
Channel 3: Ceiling Moment Targeting (highest-intent)
- The highest-intent prospects are people who've already adopted a comparable solution and hit its limits
- They've invested in learning a tool, hit its ceiling, and have low switching costs
- Craft outreach around the limitation: "We see teams that outgrow [incumbent] when they need [capability]. That's what we built."
- These convert 3-5x better than cold outreach because they already understand the problem
Channel 4: Community (developer products)
- "Built [X] to solve [problem], looking for early users"
- Offer white-glove onboarding
- Best for products where users congregate in Slack/Discord/forums
4. The 2-Week Experiment Cycle
The Pattern:
Speed in early stages matters more than perfection. The constraint isn't whether you're right — it's how quickly you can test assumptions and iterate.
How to Execute:
- Frame every test with clear success criteria before starting
- Test one variable per experiment (messaging, channel, pricing, feature)
- Run for 2 weeks maximum — if it's not showing signal by then, it won't
- If it works, allocate 3x resources within a week
- If it doesn't, kill it and move to the next test
- Document what you learned regardless of outcome
The Playbook Rule:
Every successful experiment must become a playbook before scaling. Structure: Goal → Steps → Expected output → Metrics → Risks. If someone unfamiliar can't execute the playbook, it's not documented well enough.
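One way to keep playbooks uniform is a small structured template. A sketch, assuming nothing beyond the Goal → Steps → Expected output → Metrics → Risks structure named above; the example content is hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class Playbook:
    # Fields mirror the structure above: Goal -> Steps -> Expected output -> Metrics -> Risks
    goal: str
    steps: list[str]
    expected_output: str
    metrics: list[str]
    risks: list[str] = field(default_factory=list)

    def is_executable(self) -> bool:
        # Rough completeness check: someone unfamiliar needs every core field filled in.
        return bool(self.goal and self.steps and self.expected_output and self.metrics)

# Hypothetical example content, not from the source:
outreach = Playbook(
    goal="Book 8 trials from 50 targeted emails",
    steps=["Build 100-account list", "Personalize first line to pain", "Send 50, follow up once"],
    expected_output="15 replies and 8 trials within 2 weeks",
    metrics=["reply rate >= 30%", "trial rate >= 15%"],
    risks=["List quality too low", "Message tests the wrong pain"],
)
print(outreach.is_executable())
```

The `is_executable` check is a stand-in for the real test: hand the playbook to someone who wasn't in the experiment and see if they can run it.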
Why This Matters:
One-off wins don't compound. Systematized experiments do. The goal isn't a single launch — it's building a repeatable machine for testing assumptions at speed.
Common Mistake:
Over-planning before testing. Waiting for "perfect" conditions before launching. Staying with failing experiments too long because you've invested emotional energy. Make decisions with 70% information.
5. Partner-Led Market Entry (When You Don't Have Distribution)
The Pattern:
Rather than entering new markets through direct sales alone, use partnerships with established players to accelerate.
How to Execute:
- Identify market leaders in your target segment
- Approach with customer problem, not partnership pitch — "What if your users could access [capability]?" shifts from your need to their need
- Start small: Help them solve one specific problem (narrow integration, not full partnership)
- Prove value with a 3-6 month pilot before asking for broader commitment
- Build reference customers together — reduces their risk
- Leverage their GTM: once integrated, they market to their base
The Supernode Pattern:
Position yourself as the integration hub that other tools naturally connect through. You own critical data or workflows that other platforms need. This compounds — each new partner makes you more valuable to the next.
Category Sequencing:
Don't pursue partnerships everywhere. Dominate 2-3 categories per quarter:
- Lead with genuine use cases: "Our users ask for [partner] integration 50x per month"
- Once you partner with a top player, competitors feel urgency to work with you too
- After 2-3 successful partnerships in a category, create joint customer stories
Common Mistake:
Launching partnerships without clear integration pathways. Expecting partners to drive awareness without support. Treating partnerships as a sales channel rather than platform expansion.
6. PMF Validation Checklist
Product-market fit is when customers pull you forward, not when you push them.
Retention:
- 40%+ of Week 1 users return Week 4
- Usage increasing over time
- Customers renewing without sales push
Organic Growth:
- Word-of-mouth referrals happening
- Customers asking "can I add my team?"
- Inbound interest without paid marketing
Sales Velocity:
- Sales cycles shortening
- Win rates >30% of trials
- Customers saying "we need this now"
Qualitative:
- >40% very disappointed if product went away (Sean Ellis test)
- Customers can articulate what it's for (clear use case)
- Customers advocating publicly
If you don't have these, you don't have PMF yet. Don't scale marketing/sales.
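The quantitative thresholds in this checklist can be checked mechanically each week. A minimal sketch, where the thresholds (40% retention, >30% win rate, >40% very disappointed) come from the checklist above and the input numbers are hypothetical:

```python
# Minimal PMF-signal check using the thresholds from the checklist above.
# Input numbers below are hypothetical; only the thresholds come from the text.
def pmf_signals(week1_users: int, week4_returners: int,
                trials: int, wins: int,
                surveyed: int, very_disappointed: int) -> dict[str, bool]:
    return {
        "retention_40pct": week4_returners / week1_users >= 0.40,   # Week 1 -> Week 4
        "win_rate_30pct": wins / trials > 0.30,                     # trial win rate
        "sean_ellis_40pct": very_disappointed / surveyed > 0.40,    # Sean Ellis test
    }

signals = pmf_signals(week1_users=200, week4_returners=90,
                      trials=20, wins=8,
                      surveyed=50, very_disappointed=22)
print(signals)
print(all(signals.values()))
```

This covers only the measurable signals; the qualitative ones (word-of-mouth, public advocacy, "can I add my team?") still need human judgment.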
Decision Trees
Why Is Our Launch Stalling?
Do prospects understand what you are?
├─ No → Layer 1: Positioning problem
│ Fix: Test new messaging before changing product
└─ Yes → Continue...
│
Do users activate after signing up?
├─ No → Layer 2: Experience problem
│ Fix: Restrict onboarding to 2-3 use cases, guide to aha moment
└─ Yes → Continue...
│
Is the team aligned on what matters?
├─ No → Layer 3: Alignment problem
│ Fix: Single north star, weekly visibility, cut non-essential
└─ Yes → Keep iterating, you're on the right track
Press Launch or Direct Outreach?
Self-serve ready? (Users get value in <10 min)
├─ No → Direct outreach only (press won't convert)
└─ Yes → Do you have >$1M funding to announce?
├─ Yes → Both (press for awareness, outreach for conversion)
└─ No → Direct outreach first, press later
Common Mistakes
1. Optimizing for headlines instead of activation: 50K impressions, 12 signups. Press ≠ growth.
2. No target customer list before launch: Spray-and-pray doesn't work at 0-to-1. Build the list of 100 accounts first.
3. Flexibility without defaults: Giving users every option paralyzes them. Pick 2-3 undeniable use cases and guide hard.
4. Giving product away for free: Free users give polite feedback. Paying users give honest feedback.
5. Scaling before learning: First 10 customers are for learning, not revenue. Document everything.
6. Over-planning, under-testing: Run 2-week experiments with clear kill criteria. Move fast, document learnings.
7. Diagnosing the wrong layer: A positioning fix when the problem is experience wastes marketing. An experience fix when the problem is positioning wastes engineering.
Quick Reference
Three-layer diagnosis:
- Layer 1: Positioning (messaging sounds like competitors) → Test new messaging
- Layer 2: Experience (awareness but no activation) → Guide to aha moment
- Layer 3: Alignment (team scattered) → Single north star, weekly visibility
First 10 customers: Personal network (2-3) → Direct outreach (3-20) → Ceiling moment targeting (highest intent) → Community (developer products)
2-week experiment cycle: Hypothesis → Success criteria → Test (2 weeks max) → Kill or 3x → Document playbook
PMF signals: 40%+ Week 1→4 retention + word-of-mouth + shortening sales cycles + >40% very disappointed
Partner-led entry: Customer problem first → Narrow pilot → Reference customers together → Leverage their GTM
Related Skills
- product-led-growth: Scaling after initial traction
- positioning-strategy: Positioning for launch
- partnership-architecture: Partner-led market entry
Based on launching features that optimized for press and got 12 signups from 50K impressions, diagnosing launch stalls across three companies using the three-layer model, and building the 2-week experiment cycle that turned ad hoc testing into a repeatable machine. Also draws on partner-led market entry across multiple geographies and segments. Not theory — lessons from mistaking vanity metrics for growth and learning to diagnose the actual problem.