Competitive Battlecard Generator

A sequential system for building, maintaining, and deploying competitive intelligence as battlecards. Each phase produces a defined output that feeds the next. Phase 6 captures field intel that keeps battlecards current rather than letting them decay into shelf-ware.

Why This Skill Exists

Competitive battlecards at most B2B SaaS companies are one of two things:

  1. A feature comparison matrix that nobody reads. Marketing builds a table comparing 40 features. Checkmarks everywhere. Sales never opens it because it doesn't tell them what to say when a prospect mentions the competitor on a live call.

  2. Tribal knowledge in the heads of 2-3 senior reps. They know what to say because they've lost deals to this competitor before. Nobody else on the team has that context. When those reps leave, the intel leaves with them.

Both fail because they confuse information with enablement. A battlecard isn't a data sheet. It's a decision aid that tells a rep exactly what to say, what to ask, and what to avoid in a competitive deal, specific to the persona they're talking to.

This skill builds battlecards that sales and BDRs actually use, keeps them current with field feedback, and connects them to the rest of your go-to-market stack.

When to Use This Skill

  • Building a battlecard for a specific competitor from scratch
  • Updating an existing battlecard with new competitive intel
  • Responding to a competitor's product launch, pricing change, or acquisition
  • Preparing a BDR or AE for a deal where a specific competitor is involved
  • Analyzing win/loss data to identify competitive patterns
  • Building persona-specific competitive talk tracks
  • Reviewing competitive positioning as part of quarterly planning
  • Onboarding new sales/BDR team members on competitive landscape

When NOT to Use This Skill

  • General market research or industry analysis (this is competitor-specific)
  • Product roadmap planning (competitive intel informs it, but this isn't a product management skill)
  • Pricing strategy (this skill covers how to handle competitor pricing objections, not how to set your own prices)
  • Content creation for comparison/alternative pages (use SEO or content skills for that; this skill produces internal enablement, not external content)
  • Evaluating potential acquisition targets (different analysis framework)

The Battlecard Pipeline: 6 Sequential Phases

PHASE 1              PHASE 2              PHASE 3
Competitor       →   Positioning &    →   Persona-Specific
Intelligence         Differentiation      Battlecard
Gathering            Analysis             Assembly

     ↑                                         ↓

PHASE 6              PHASE 5              PHASE 4
Refresh Cycle    ←   Win/Loss         ←   Deployment &
& Decay              Analysis             Enablement
Prevention

The closed loop: Phase 6 runs on a fixed cadence. Field intel from sales, new competitor moves, and win/loss patterns feed back into Phase 1. Battlecards that aren't refreshed every 90 days get flagged for review or retirement.


Phase 1: Competitor Intelligence Gathering

Input: Competitor name, your product context, existing intel (if any)
Output: Structured competitor intelligence brief

Intelligence Source Matrix

Gather from multiple sources. No single source gives you the full picture.

| Source | Signal Quality | What to Extract | Update Frequency |
|--------|----------------|-----------------|------------------|
| Competitor website | Medium | Pricing (if public), messaging, feature claims, customer logos, positioning changes | Monthly |
| G2/TrustRadius reviews | High | What their customers praise, what they complain about, switching reasons, comparison mentions | Monthly |
| Your win/loss interviews | Very High | Why you won, why you lost, what the buyer compared, what mattered most | After every closed deal |
| Sales call recordings | High | What prospects say about the competitor unprompted, specific objections, feature comparisons mentioned | Ongoing |
| Competitor job postings | Medium | Product direction signals (hiring for a new category = building it), tech stack, growth signals | Quarterly |
| Competitor blog/changelog | Medium | Feature launches, messaging shifts, vertical focus changes | Monthly |
| Industry analyst reports | Medium | Market positioning, quadrant placement, analyst perspective on strengths/weaknesses | Annually or when published |
| Social media / LinkedIn | Low-Medium | Executive messaging, thought leadership themes, customer engagement, employee sentiment | Monthly |
| Patent/technical filings | Low | Long-term product direction. Useful for enterprise competitors. | Quarterly |
| Former competitor employees | Very High (if available) | Internal roadmap, culture, sales process, known weaknesses | Opportunistic |

Competitor Intelligence Brief Template

COMPETITOR INTELLIGENCE BRIEF

Competitor: ________________________
Last updated: ________________________
Updated by: ________________________
Next review date: ________________________ (max 90 days)

COMPANY OVERVIEW
- Founded: ________
- HQ: ________
- Funding / public status: ________
- Estimated revenue / ARR: ________
- Employee count: ________ (growth trend: ________)
- Key investors / parent company: ________

PRODUCT OVERVIEW
- Core product: [One sentence description]
- Primary use case: [What most customers buy it for]
- Secondary use cases: [What else it does]
- Platform / architecture: [Cloud-native, on-prem, hybrid]
- Key integrations: [Tech stack dependencies]

PRICING (as known)
- Model: [Per user / per asset / per framework / flat rate / usage-based]
- Entry price: $________ [source: ________]
- Mid-market price: $________ [source: ________]
- Enterprise price: $________ [source: ________]
- Free tier / trial: [Yes/No, details]
- Contract terms: [Annual only, monthly available, multi-year discounts]
- Known discounting behavior: [Aggressive, standard, rigid]

TARGET MARKET
- Primary segment: [SMB / Mid-market / Enterprise]
- Key verticals: [Industries they focus on]
- Geo focus: [Where they sell most]
- ICP overlap with us: [High / Medium / Low, specifics]

MESSAGING & POSITIONING
- Primary value prop: [How they describe themselves]
- Key differentiators (their claimed): [What they say makes them different]
- Messaging themes: [Risk, speed, cost, compliance, developer experience, etc.]
- Recent messaging shifts: [Any changes in how they position]

KNOWN STRENGTHS (evidence-based, not assumed)
1. [Strength] — Evidence: [G2 reviews, win/loss data, customer quotes]
2. [Strength] — Evidence: ________
3. [Strength] — Evidence: ________

KNOWN WEAKNESSES (evidence-based, not assumed)
1. [Weakness] — Evidence: [G2 complaints, lost deals to us, technical gaps]
2. [Weakness] — Evidence: ________
3. [Weakness] — Evidence: ________

RECENT MOVES (last 90 days)
- Product: [New features, acquisitions, partnerships]
- Pricing: [Changes, new tiers, discounting patterns]
- Go-to-market: [New campaigns, verticals, channels]
- People: [Key hires, departures, leadership changes]

WHAT THEIR CUSTOMERS SAY (verbatim from G2/reviews)
- Positive: "[Quote]" — [Source]
- Positive: "[Quote]" — [Source]
- Negative: "[Quote]" — [Source]
- Negative: "[Quote]" — [Source]

WHAT OUR PROSPECTS SAY ABOUT THEM (from sales calls)
- "[Quote]" — [Persona, deal stage, outcome]
- "[Quote]" — [Persona, deal stage, outcome]

Intelligence Quality Rules

  1. Cite everything. Every strength and weakness needs a source. "We think they're weak at X" is an opinion. "Three G2 reviews from mid-market CISOs mention X as a limitation" is intel.

  2. Separate what they claim from what customers say. Their website says "seamless integration." Their G2 reviews say "integration took 3 months." Both matter. The gap between the two is your ammunition.

  3. Date everything. Intel from 12 months ago is unreliable. Pricing from 6 months ago may have changed. Mark the date on every data point.

  4. Don't invent weaknesses. If you don't have evidence of a weakness, don't list it. Reps who use unverified claims in deals lose credibility when the prospect checks.


Phase 2: Positioning & Differentiation Analysis

Input: Competitor intelligence brief (Phase 1), your product context
Output: Differentiation map with positioning strategy

Differentiation Framework

Don't list 40 features. Identify the 3-5 dimensions that actually decide deals.

DIFFERENTIATION MAP

Competitor: ________________________
Analysis date: ________________________

DIMENSION 1: ________________________
Us: [Our position on this dimension, with evidence]
Them: [Their position, with evidence]
Buyer impact: [Why this matters to the buyer, not why it matters to us]
Verdict: [Advantage us / Advantage them / Parity]

DIMENSION 2: ________________________
Us: ________
Them: ________
Buyer impact: ________
Verdict: ________

DIMENSION 3: ________________________
Us: ________
Them: ________
Buyer impact: ________
Verdict: ________

DIMENSION 4: ________________________
Us: ________
Them: ________
Buyer impact: ________
Verdict: ________

DIMENSION 5: ________________________
Us: ________
Them: ________
Buyer impact: ________
Verdict: ________

SUMMARY
Dimensions where we win: ________
Dimensions where they win: ________
Dimensions at parity: ________
Our best competitive angle: [The 1-2 dimensions where we have the
strongest advantage AND the buyer cares the most]

How to Pick the Right Dimensions

The dimensions should come from deal data, not from your product team's feature list.

| Good Dimensions (decide deals) | Bad Dimensions (don't decide deals) |
|--------------------------------|-------------------------------------|
| Time to value / implementation speed | Number of features |
| Framework coverage depth | UI aesthetics |
| Evidence collection automation | Company founding date |
| Multi-framework overlap handling | Number of employees |
| Integration with buyer's existing stack | Number of customers (unless dramatically different) |
| Support quality and responsiveness | Awards or analyst mentions |
| Total cost of ownership (not just license) | Technology stack choice |

The test: Ask the reps who owned your last 5 closed-won and 5 closed-lost deals: "What was the deciding factor?" If a dimension doesn't appear in those answers, it doesn't belong on the battlecard.
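A sketch of that test in practice: tally the deciding-factor answers and keep only the dimensions that recur. All deal answers below are invented for illustration.

```python
from collections import Counter

# Hypothetical deciding-factor answers from the last 10 closed deals
# (5 won, 5 lost) -- illustrative data, not real deal records.
answers = [
    "implementation speed", "implementation speed", "total cost of ownership",
    "framework coverage", "implementation speed", "total cost of ownership",
    "support quality", "framework coverage", "total cost of ownership",
    "ui aesthetics",
]

counts = Counter(answers)

# Keep only dimensions that appear in at least 2 of the 10 answers;
# anything rarer is an anecdote, not a battlecard dimension.
dimensions = [factor for factor, n in counts.most_common() if n >= 2]
print(dimensions)
```

With this sample data, single mentions like "ui aesthetics" drop out and only three dimensions survive for the map.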

Positioning Strategy

Based on the differentiation map, pick one of three competitive strategies:

| Strategy | When to Use | How It Sounds |
|----------|-------------|---------------|
| Head-to-head | You win on the same dimensions they compete on | "We do what they do, but [faster / deeper / cheaper / with better support]." |
| Reframe | You lose on their dimension but win on a different one that matters more | "They optimize for [X]. We optimize for [Y]. Here's why [Y] matters more for your situation." |
| Niche | You dominate a specific segment or use case they don't serve well | "For [your specific situation], we're purpose-built. They're a general tool being used for a specific job." |

Most competitive deals are won by reframing, not by head-to-head comparison. If you're competing on their best dimension, you've already lost the positioning battle.


Phase 3: Persona-Specific Battlecard Assembly

Input: Differentiation map (Phase 2), competitor intel (Phase 1)
Output: Battlecards formatted for each persona in the buying committee

Why Persona-Specific Matters

A CISO evaluating your product against a competitor asks different questions than a CTO evaluating the same comparison. The CISO wants to know about risk coverage and audit readiness. The CTO wants to know about engineering burden and integration overhead. Same competitor, different battlecard.

Battlecard Template (Per Persona)

COMPETITIVE BATTLECARD

Competitor: ________________________
Persona: ________________________ [CISO / CTO / Compliance Mgr / etc.]
Version: ________________________
Last updated: ________________________

━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

QUICK REFERENCE (for live calls)

When the prospect mentions [Competitor], say:
"[One sentence acknowledgment that doesn't trash the competitor.
Example: 'They're a solid option. A lot of our customers evaluated
them too. What specifically appealed to you about their approach?']"

Then ask:
"[One diagnostic question that exposes the competitor's weakness
without naming it. Example: 'How important is [dimension where
we win] for your team?']"

━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

OUR 3 ADVANTAGES FOR THIS PERSONA

1. [Advantage]: [One sentence. Why this persona cares.]
   Proof: [Customer quote, metric, case study specific to this persona]

2. [Advantage]: [One sentence.]
   Proof: ________

3. [Advantage]: [One sentence.]
   Proof: ________

━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

THEIR 2 ADVANTAGES (be honest)

1. [Advantage]: [One sentence. Acknowledge it.]
   Our counter: [How to reframe this so it's less decisive.
   Not "that doesn't matter" but "here's the tradeoff they made
   to get that advantage."]

2. [Advantage]: ________
   Our counter: ________

━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

LANDMINES TO SET

Questions to ask the prospect BEFORE they evaluate the competitor.
These seed criteria that favor you.

1. "How does your team handle [process where we're strong]?"
2. "What's your timeline for [outcome where we're faster]?"
3. "Have you thought about [dimension they'll struggle with]?"

━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

OBJECTION RESPONSES (persona-specific)

"[Competitor] is cheaper."
→ [Response framework. Not "we're worth more." Instead: "What's
included in that price? When our customers compared total cost
including [implementation / maintenance / FTE time], the gap
[narrowed / reversed]."]

"[Competitor] has [feature we don't have]."
→ [Response. Acknowledge, then redirect: "That's true. The
tradeoff is [what they sacrificed to build that]. For teams
that prioritize [your strength], that tradeoff matters."]

"We're already using [Competitor]."
→ [Response. Don't pitch a rip-and-replace. Instead: "What's
working well? What do you wish was different?" Find the gap.
If no gap exists, this isn't your deal right now.]

"[Competitor] has more customers / is bigger."
→ [Response. "They do. For teams in [your niche / segment /
use case], we have [X] customers including [names if allowed].
The question is whether a bigger company optimizes for your
specific needs or for the average of all their customers."]

━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

TRAP QUESTIONS TO WATCH FOR

Things the competitor may have coached the prospect to ask you.

1. "[Question]" — Why they ask: [What the competitor wants to expose]
   How to respond: [Honest answer that doesn't concede the deal]

2. "[Question]" — Why they ask: ________
   How to respond: ________

━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

PROOF POINTS FOR THIS PERSONA

| Customer | Similar to Prospect? | Key Metric | Quote |
|----------|---------------------|------------|-------|
| | [Industry, size, use case] | | |
| | | | |
| | | | |

━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

DO NOT SAY

1. [Specific claim about competitor that is unverified or legally risky]
2. [Trash talk or subjective insults about competitor quality]
3. [Outdated information that has been corrected]

Battlecard Honesty Rules

  1. Always include their advantages. A battlecard that says you win on every dimension is a battlecard nobody trusts. Sales reps know when you're hiding the competitor's strengths. Include them and teach reps how to handle them.

  2. Never fabricate weaknesses. If you don't have evidence, don't include it. A rep who claims "their platform goes down constantly" and the prospect says "actually we've had zero downtime issues" just destroyed your credibility.

  3. Never instruct reps to trash the competitor. The framing is always "here's the tradeoff" not "they're bad." Prospects respect sellers who acknowledge competitor strengths. They distrust sellers who only attack.

  4. Include a "Do Not Say" section. Legal risks, outdated claims, and unverified rumors go here. Protect reps from themselves.


Phase 4: Deployment & Enablement

Input: Persona-specific battlecards (Phase 3)
Output: Trained sales/BDR team with accessible battlecards

Deployment Checklist

Building the battlecard is half the work. Getting reps to actually use it is the other half.

| Step | Action | Owner | Timeline |
|------|--------|-------|----------|
| 1 | Store battlecards in a single, searchable location (not buried in Google Drive) | Marketing ops | Before enablement session |
| 2 | Run a 30-minute enablement session per competitor (not a deck review; a live roleplay) | Product marketing / demand gen lead | Within 1 week of battlecard release |
| 3 | Roleplay the top 3 objection scenarios with BDRs and AEs | Sales manager | During enablement session |
| 4 | Share the "Quick Reference" section as a Slack-accessible snippet | Marketing ops | Same day as enablement |
| 5 | Add battlecard link to CRM deal record template (so reps see it when a competitor is tagged) | Marketing ops | Within 1 week |
| 6 | Collect initial feedback from reps after first 2 weeks of use | Product marketing | 2 weeks post-launch |

Enablement Session Structure (30 minutes per competitor)

COMPETITIVE ENABLEMENT SESSION

Competitor: ________________________
Duration: 30 minutes
Attendees: BDR team, AEs, sales leadership

SECTION 1: CONTEXT (5 minutes)
- Who is this competitor (1 minute)
- Where we see them in deals: which segments, which personas,
  how often (2 minutes)
- Our win rate against them and what decides it (2 minutes)

SECTION 2: THE BATTLECARD (10 minutes)
- Walk through the 3 advantages (with proof points)
- Walk through their 2 advantages (with counters)
- Show the landmine questions

SECTION 3: LIVE ROLEPLAY (15 minutes)
- Scenario 1: Prospect says "We're also looking at [Competitor]"
  One rep plays prospect, another plays seller. Group gives feedback.
- Scenario 2: Prospect says "[Competitor] is cheaper"
  Same format.
- Scenario 3: Prospect says "We're already using [Competitor]"
  Same format.

OUTPUT: Reps leave with the Quick Reference card and confidence
to handle the top 3 competitive scenarios.

Accessibility Rules

| Rule | Rationale |
|------|-----------|
| Battlecards must be accessible in under 10 seconds during a live call | If a rep can't find it fast, they won't use it |
| Store in the tool reps already use (Slack, CRM, Notion, Guru, Gong) | Don't create a new destination. Embed in existing workflow. |
| Mobile-friendly format | Reps take calls on their phones. A PDF that requires pinch-to-zoom is useless. |
| Searchable by competitor name | "How do we beat [X]" should return the right battlecard instantly |
| Version-dated | Reps need to know if they're looking at current intel or 6-month-old data |

Phase 5: Win/Loss Analysis

Input: Closed deal data (won and lost), sales feedback
Output: Competitive win/loss patterns that update battlecards

Win/Loss Data Collection

Capture this for every competitive deal, not just the ones you lost.

WIN/LOSS COMPETITIVE DATA

Deal: ________________________
Outcome: Won / Lost
Competitor(s) in deal: ________________________
Decision-maker persona: ________________________
Segment: [C1 / C2 / Enterprise]
ACV: $________
Sales cycle length: ________ days

PRIMARY REASON FOR OUTCOME (pick one):
[ ] Price / total cost
[ ] Product capability / feature gap
[ ] Integration / technical fit
[ ] Ease of use / user experience
[ ] Implementation speed / time to value
[ ] Support / customer success quality
[ ] Brand / trust / market presence
[ ] Existing relationship with competitor
[ ] Champion influence
[ ] Procurement / legal / security requirements
[ ] Other: ________________________

DETAILS (from rep or buyer interview):
What did the buyer say was most important? ________________________
What did they say about us vs. competitor? ________________________
If lost: What would have changed the outcome? ________________________
If won: What almost made them choose the competitor? ________________________

QUOTE (if available):
"________________________" — [Buyer persona, anonymous company description]

Win/Loss Pattern Analysis

Run quarterly. Look for patterns, not anecdotes.

WIN/LOSS COMPETITIVE ANALYSIS

Competitor: ________________________
Period: Q__ 20__
Deals analyzed: __ won, __ lost

WIN RATE VS. THIS COMPETITOR: __%

TOP 3 REASONS WE WON:
1. [Reason] — appeared in __% of wins
2. [Reason] — appeared in __% of wins
3. [Reason] — appeared in __% of wins

TOP 3 REASONS WE LOST:
1. [Reason] — appeared in __% of losses
2. [Reason] — appeared in __% of losses
3. [Reason] — appeared in __% of losses

PATTERNS BY PERSONA:
- When decision-maker is [Persona A]: win rate __% (we win because ________)
- When decision-maker is [Persona B]: win rate __% (we lose because ________)

PATTERNS BY SEGMENT:
- C1 (SMB) deals: win rate __%
- C2 (Mid-market) deals: win rate __%
- Enterprise deals: win rate __%

PATTERNS BY DEAL SIZE:
- Deals under $__K ACV: win rate __%
- Deals over $__K ACV: win rate __%

BATTLECARD UPDATES NEEDED:
1. [What to add/change based on this data]
2. [What to add/change]
3. [What to remove because it's no longer accurate]
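A minimal sketch of the persona cut of this analysis, assuming deal records exported from your CRM as (competitor, persona, outcome) tuples. Competitor and persona names here are invented:

```python
from collections import defaultdict

# Hypothetical closed competitive deals: (competitor, persona, outcome).
# Illustrative records only -- in practice, export these from your CRM.
deals = [
    ("AcmeSec", "CISO", "won"), ("AcmeSec", "CISO", "won"),
    ("AcmeSec", "CTO", "lost"), ("AcmeSec", "CTO", "lost"),
    ("AcmeSec", "CISO", "lost"), ("AcmeSec", "CTO", "won"),
]

def win_rates(deals):
    """Overall win rate and win rate per decision-maker persona, as percentages."""
    by_persona = defaultdict(lambda: [0, 0])  # persona -> [wins, total]
    wins = 0
    for _, persona, outcome in deals:
        by_persona[persona][1] += 1
        if outcome == "won":
            by_persona[persona][0] += 1
            wins += 1
    overall = round(100 * wins / len(deals))
    return overall, {p: round(100 * w / t) for p, (w, t) in by_persona.items()}

overall, per_persona = win_rates(deals)
print(overall, per_persona)
```

The same grouping works for segment and deal-size cuts by swapping the key you group on.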

Win/Loss Interview Guide

The best competitive intel comes from buyers who chose the competitor, not from your own team's speculation. Run these interviews within 30 days of a lost deal.

WIN/LOSS INTERVIEW GUIDE (15-20 minutes)

Conducted by: [Someone other than the rep who owned the deal.
Buyers are more honest with a neutral party.]

OPENING:
"Thanks for taking the time. We're trying to improve our product and
our process. Your honest feedback helps, and there are no wrong answers."

QUESTIONS:

1. "Walk me through your evaluation process. How did you decide which
   vendors to look at?"
   [Understand how they found you and the competitor]

2. "What were the 2-3 criteria that mattered most in your decision?"
   [Don't suggest criteria. Let them name them unprompted.]

3. "How did [Competitor] perform against those criteria vs. us?"
   [Get the specific comparison on their terms, not yours]

4. "Was there a moment in the evaluation where you felt one option
   pulled ahead? What was it?"
   [Identify the tipping point. This is the most valuable insight.]

5. "How did pricing factor into the decision?"
   [If price was the decider, dig into whether it was license cost
   or total cost including implementation and maintenance]

6. "If you could change one thing about our product or our process
   during the evaluation, what would it be?"
   [Product feedback AND sales process feedback in one question]

7. "Anything else you'd want us to know?"
   [Open-ended. Sometimes the most useful insight comes here.]

Phase 6: Refresh Cycle & Decay Prevention

Input: Field intel (Phase 5), competitor monitoring (Phase 1 sources)
Output: Updated battlecards on a fixed cadence

Refresh Cadence

| Trigger | Action | Timeline |
|---------|--------|----------|
| Scheduled refresh | Full battlecard review against latest intel | Every 90 days, non-negotiable |
| Competitor product launch | Update product overview, assess impact on differentiation map, update objection responses | Within 1 week of announcement |
| Competitor pricing change | Update pricing section, recalculate TCO comparison, update pricing objection response | Within 1 week of confirmation |
| Competitor acquisition / merger | Reassess entire competitive landscape. May require new battlecard or retirement of old one. | Within 2 weeks |
| Win/loss pattern shift | If win rate against a competitor changes by >10% in a quarter, investigate and update | Within 2 weeks of quarterly analysis |
| New evidence from sales | Update specific sections (quotes, objection responses, proof points) as intel arrives | Within 1 week of receiving intel |
| Competitor messaging shift | Update positioning analysis, check if your landmine questions still work | Within 2 weeks |
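The win-rate trigger above reduces to a comparison of two quarterly rates. A sketch, with the threshold in percentage points and an illustrative function name:

```python
def refresh_triggered(prev_q_win_rate: float, curr_q_win_rate: float,
                      threshold: float = 10.0) -> bool:
    """Flag a battlecard for investigation when the quarterly win rate
    against a competitor moves by more than `threshold` points in
    either direction (a jump is as informative as a drop)."""
    return abs(curr_q_win_rate - prev_q_win_rate) > threshold

print(refresh_triggered(42.0, 28.0))  # 14-point drop -> True
```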

Decay Prevention Rules

Battlecards without maintenance become liabilities. A rep using 6-month-old pricing data or a discontinued feature comparison in a live deal does more damage than having no battlecard at all.

BATTLECARD HEALTH CHECK (run quarterly)

For each active battlecard:

[ ] Last updated date is within 90 days
    If not: schedule immediate refresh or retire

[ ] Pricing data verified within 60 days
    If not: check competitor website, G2, recent deal intel

[ ] All customer quotes and proof points are from current customers
    (not churned accounts)
    If not: replace with current references

[ ] Win/loss data reflects the last 2 quarters, not historical averages
    If not: rerun win/loss analysis

[ ] At least 2 reps have given feedback on battlecard accuracy
    in the last quarter
    If not: collect feedback before next refresh

[ ] No items in "Do Not Say" section that should now be in the
    main battlecard (competitor fixed a weakness, or claim is
    now verified)
    If not: update accordingly

HEALTH STATUS:
[ ] Green: all checks pass. Schedule next review in 90 days.
[ ] Yellow: 1-2 checks failed. Fix within 2 weeks.
[ ] Red: 3+ checks failed. Battlecard unreliable. Pull from
    circulation until refreshed.
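The date checks in this checklist can be sketched as a small helper. The 90-day and 60-day thresholds come from the checklist itself; the function name and the mapping of failed-check counts to green/yellow/red mirror the HEALTH STATUS rules:

```python
from datetime import date

def battlecard_status(last_updated: date, pricing_verified: date,
                      other_failed_checks: int, today: date) -> str:
    """Traffic-light status: green = all checks pass, yellow = 1-2 failed,
    red = 3+ failed (pull from circulation until refreshed)."""
    failed = other_failed_checks
    if (today - last_updated).days > 90:       # refresh window exceeded
        failed += 1
    if (today - pricing_verified).days > 60:   # pricing data stale
        failed += 1
    if failed == 0:
        return "green"
    return "yellow" if failed <= 2 else "red"

print(battlecard_status(date(2026, 1, 10), date(2026, 2, 20), 0,
                        today=date(2026, 3, 18)))  # -> green
```

`other_failed_checks` stands in for the non-date checks (quotes, win/loss recency, rep feedback), which need human review.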

Competitor Monitoring Automation

Where possible, automate the intel gathering so refreshes are data-informed, not memory-dependent.

| Source | Automation | Tool Options |
|--------|------------|--------------|
| Competitor website changes | Page change monitoring | Visualping, ChangeTower, or manual monthly screenshot |
| G2 new reviews | Alert on new reviews for competitor | G2 alerts (if available on your plan) |
| Job postings | Track new roles with keywords | LinkedIn alerts, Otta, or manual quarterly check |
| News / press | Google Alerts for competitor name | Google Alerts, Feedly |
| Pricing page changes | Page change monitoring on pricing URL | Visualping on their pricing page specifically |
| Social media / LinkedIn | Track competitor company page and key execs | LinkedIn notifications, manual monthly check |
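If you'd rather not buy a tool, a naive page-change monitor is a content hash compared across runs. A sketch, assuming you persist one fingerprint per monitored URL between runs; unlike visual-diffing tools, a raw hash fires on any markup change, including trivial ones:

```python
import hashlib
import urllib.request

def page_fingerprint(html: bytes) -> str:
    """Stable fingerprint of a page body; compare across runs to detect edits."""
    return hashlib.sha256(html).hexdigest()

def has_changed(stored_fingerprint: str, html: bytes) -> bool:
    """True when the fetched page differs from the stored fingerprint."""
    return page_fingerprint(html) != stored_fingerprint

def fetch(url: str) -> bytes:
    """Fetch the raw page body (run on a schedule, e.g. a weekly cron job)."""
    with urllib.request.urlopen(url) as resp:
        return resp.read()

# Each run: html = fetch(pricing_url); alert if has_changed(stored, html),
# then store the new fingerprint for the next run.
print(has_changed(page_fingerprint(b"$99/user/mo"), b"$129/user/mo")) # True
```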

Integration Notes

With Other Skills in This Repo

  • BDR Enablement Generator: Phase 6 objection handling and Phase 4 message templates should reference battlecard content. When a BDR encounters a competitive deal, they pull from the battlecard's Quick Reference and persona-specific objection responses. The BDR skill's monthly performance review (Phase 7) captures competitor mentions that feed back into battlecard updates here.

  • ABM Program Orchestrator: Phase 2 account selection includes competitive displacement opportunities. The battlecard's intel on competitor weaknesses and contract renewal timing directly informs which accounts to target for ABM and what messaging angle to lead with.

  • GRC Messaging Guardrails: All competitive claims referencing compliance frameworks must be validated. Saying "[Competitor] doesn't support HIPAA" when they actually do, or saying "we're SOC 2 certified" (attestation, not certification) in a competitive context, is a credibility-destroying error in front of the exact buyer you're trying to win.

  • Pipeline Attribution Narrator: Competitive deal outcomes should be trackable in attribution data. Tag competitive deals in CRM so you can measure whether your win rate differs by channel source. If ABM-sourced competitive deals have a higher win rate than inbound competitive deals, that informs both battlecard deployment and channel allocation.

  • Marketing Ops SOP Generator: Battlecard distribution and version control follow the change management process from Phase 7. Updates to battlecards should be logged and communicated through the same channels as other marketing ops changes.

With Your Tech Stack

  • HubSpot / CRM: Create a custom deal property for "Competitor in Deal" (dropdown). This enables win/loss analysis by competitor in Phase 5. Add a required field for competitive loss reason on closed-lost deals.

  • Gong / Chorus / call recording: Keyword alerts for competitor names surface competitive mentions automatically. Use these as a continuous intel source for Phase 6 refreshes.

  • Slack: Create a #competitive-intel channel. Reps post competitor mentions and buyer quotes in real-time. Marketing aggregates these during refresh cycles. Pin the current battlecard Quick Reference in the channel.

  • Guru / Notion / internal wiki: Store full battlecards here. The Quick Reference cards go in Slack. Link CRM deal records to the relevant battlecard so reps can access it from the deal they're working.
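If you automate the CRM setup, HubSpot's v3 Properties API accepts a JSON body along these lines (POST to /crm/v3/properties/deals with an access token). The property name, group name, and competitor values below are assumptions; verify field names against HubSpot's current property documentation before relying on this:

```python
import json

def competitor_property_payload(competitors):
    """Hedged sketch of a request body for a "Competitor in Deal" dropdown.
    All identifiers here are illustrative, not confirmed for your portal."""
    return {
        "name": "competitor_in_deal",
        "label": "Competitor in Deal",
        "type": "enumeration",
        "fieldType": "select",
        "groupName": "dealinformation",
        "options": [
            {"label": c, "value": c.lower().replace(" ", "_")}
            for c in competitors
        ],
    }

print(json.dumps(competitor_property_payload(["AcmeSec", "None"]), indent=2))
```

Include a "None" option so reps can mark non-competitive deals explicitly, which keeps the Phase 5 win-rate denominators honest.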


Common Battlecard Mistakes

| Mistake | What Happens | How to Avoid |
|---------|--------------|--------------|
| Feature comparison matrix instead of battlecard | 40-row table nobody reads. Reps still wing it on calls. | Focus on 3 advantages, 2 honest disadvantages, and specific talk tracks. Less is more. |
| Same battlecard for every persona | CISO gets CTO talking points. CTO gets compliance language. Neither resonates. | Build persona-specific cards. At minimum: technical evaluator, economic buyer, day-to-day user. |
| Only listing your advantages | Reps get blindsided when prospects mention competitor strengths. Credibility lost. | Always include their advantages with honest counters. Reps who acknowledge competitor strengths build more trust. |
| Building it and never updating it | Pricing is 8 months old. Feature they "lacked" was shipped 3 months ago. Rep uses outdated claim, prospect corrects them. | 90-day refresh cycle. Phase 6. Non-negotiable. |
| No field validation | Battlecard is built from marketing's assumptions, not from what buyers actually say. | Win/loss interviews (Phase 5) are the ground truth. Build from deal data, not product marketing's positioning deck. |
| Storing it somewhere nobody looks | Beautiful battlecard in a Google Drive folder with 200 other docs. Usage rate: 0%. | Embed in the tools reps already use: Slack, CRM, Gong, Guru. If they have to go find it, they won't. |
| Trashing the competitor | Prospect respects the competitor. Your rep insults them. Prospect disqualifies you on trust. | Never trash. Always reframe. "They optimize for X, which makes sense for some teams. We optimize for Y, which matters more when [prospect's situation]." |
| No enablement session | Battlecard emailed to the team with "please review." Nobody reviews it. | 30-minute enablement session with live roleplay. Phase 4. It's the difference between a document and a capability. |

Validation Checklist

Run this before deploying any new or refreshed battlecard.

Intelligence Quality

  • All strengths and weaknesses have cited evidence (not assumptions)
  • Pricing data sourced and dated within 60 days
  • At least 3 G2/review data points included
  • At least 2 win/loss data points from your own deals
  • "Recent Moves" section current within 30 days
  • No unverified claims in the battlecard

Battlecard Structure

  • Quick Reference section is usable in under 10 seconds on a live call
  • 3 advantages with proof points per persona
  • 2 honest competitor advantages with counters
  • Landmine questions test well (ask a rep if they'd use them)
  • Objection responses cover the top 3-4 competitive objections
  • "Do Not Say" section included
  • Persona-specific versions for at least 2 key personas

Deployment Readiness

  • Stored in accessible, searchable location
  • Enablement session scheduled within 1 week
  • Roleplay scenarios prepared for session
  • Quick Reference shared in Slack / team channel
  • CRM deal template links to battlecard
  • Feedback collection planned for 2 weeks post-launch

Maintenance

  • Next refresh date set (max 90 days)
  • Competitor monitoring alerts configured
  • Win/loss data collection process active for competitive deals
  • Decay prevention health check scheduled quarterly

Changelog

  • v1.0 (March 2026): Initial release. 6-phase sequential pipeline, intelligence gathering framework, differentiation analysis, persona-specific battlecard templates, deployment and enablement playbook, win/loss analysis system, refresh cycle with decay prevention.
