> cursor-api-key-management
Configure BYOK API keys for OpenAI, Anthropic, Google, Azure, and custom models in Cursor. Triggers on "cursor api key", "cursor openai key", "cursor anthropic key", "own api key cursor", "BYOK cursor", "cursor azure key".
# Cursor API Key Management
Configure Bring Your Own Key (BYOK) for AI model providers in Cursor. BYOK lets you use your own API keys to bypass Cursor's monthly quota, pay per token directly, and access models not included in Cursor's subscription.
## Supported Providers
| Provider | Key Format | Models Available |
|---|---|---|
| OpenAI | sk-proj-... or sk-... | GPT-4o, GPT-4o-mini, o1, o3, GPT-5 |
| Anthropic | sk-ant-api03-... | Claude Sonnet, Claude Opus, Claude Haiku |
| Google (Gemini) | AIzaSy... | Gemini 2.5 Pro, Gemini Flash |
| Azure OpenAI | Azure portal key | Any deployed Azure OpenAI model |
| AWS Bedrock | IAM credentials | Claude, Titan, Llama models |
| OpenAI-compatible | Varies | Ollama, LM Studio, Together AI, etc. |
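The key prefixes in the table are distinctive enough to sanity-check which provider a key belongs to before pasting it into the wrong field. A minimal sketch (prefix matching is a heuristic; providers change key formats over time):

```shell
#!/bin/sh
# Heuristic provider detection from an API key prefix.
# Order matters: sk-ant-* must be tested before the generic sk-* patterns.
detect_provider() {
  case "$1" in
    sk-ant-api03-*) echo "anthropic" ;;   # Anthropic key format
    sk-proj-*|sk-*) echo "openai"    ;;   # OpenAI project or legacy key
    AIzaSy*)        echo "google"    ;;   # Google AI Studio key
    *)              echo "unknown"   ;;
  esac
}

detect_provider "sk-ant-api03-example"   # prints: anthropic
detect_provider "AIzaSyExample"          # prints: google
```

Azure and Bedrock credentials have no reliable prefix, so they fall through to `unknown`.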
## Configuration Steps

### OpenAI

- Go to platform.openai.com/api-keys
- Create a new API key (project-scoped recommended)
- In Cursor: `Cursor Settings > Models` > check `Use own API key`
- Paste the key in the OpenAI API Key field
- Select a model from the dropdown (e.g., `gpt-4o`)
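Before pasting the key into Cursor, it is worth confirming it works from a terminal. A quick check against OpenAI's model-list endpoint (assumes the new key is in `$OPENAI_API_KEY`):

```shell
# A JSON list of models confirms the key is valid; a 401 error body
# means the key is wrong or revoked. head keeps the output short.
curl -s https://api.openai.com/v1/models \
  -H "Authorization: Bearer $OPENAI_API_KEY" | head -c 300
```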
### Anthropic

- Go to console.anthropic.com/settings/keys
- Create a new API key
- In Cursor: `Cursor Settings > Models` > check `Use own API key`
- Paste the key in the Anthropic API Key field
- Select a Claude model from the dropdown
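The same sanity check for an Anthropic key (assumes `$ANTHROPIC_API_KEY`; Anthropic authenticates with an `x-api-key` header and requires a version header):

```shell
# A JSON model list confirms the key; an authentication_error body means
# the key is invalid. The anthropic-version header is mandatory.
curl -s https://api.anthropic.com/v1/models \
  -H "x-api-key: $ANTHROPIC_API_KEY" \
  -H "anthropic-version: 2023-06-01" | head -c 300
```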
### Google (Gemini)

- Go to aistudio.google.com/apikey
- Create an API key
- In Cursor: `Cursor Settings > Models` > check `Use own API key`
- Paste the key in the Google API Key field
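A Gemini key can be checked the same way (assumes `$GOOGLE_API_KEY`; Google passes the key as a query parameter):

```shell
# Lists available Gemini models; an API_KEY_INVALID error means the key is bad.
curl -s "https://generativelanguage.googleapis.com/v1beta/models?key=$GOOGLE_API_KEY" \
  | head -c 300
```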
### Azure OpenAI

Azure requires additional configuration beyond a simple API key:

- In the Azure Portal: create an Azure OpenAI resource
- Deploy your desired model (e.g., `gpt-4o`)
- Note the Endpoint URL and API Key from the resource
- In Cursor: `Cursor Settings > Models`, enter the Azure configuration:

```
API Key: xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
Endpoint: https://your-instance.openai.azure.com
Deployment: your-gpt4o-deployment-name
API Version: 2024-10-21
```
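To confirm the Azure values before entering them in Cursor, send a minimal request directly. A sketch using the placeholder values from the configuration above (assumes the key is in `$AZURE_OPENAI_KEY`):

```shell
# Azure routes by deployment name, not model name, and the api-version
# query parameter is required. head keeps the response short.
ENDPOINT="https://your-instance.openai.azure.com"
DEPLOYMENT="your-gpt4o-deployment-name"
API_VERSION="2024-10-21"

curl -s "$ENDPOINT/openai/deployments/$DEPLOYMENT/chat/completions?api-version=$API_VERSION" \
  -H "api-key: $AZURE_OPENAI_KEY" \
  -H "Content-Type: application/json" \
  -d '{"messages":[{"role":"user","content":"ping"}],"max_tokens":5}' | head -c 300
```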
### Custom OpenAI-Compatible Endpoints

For self-hosted models (Ollama, vLLM) or third-party providers:

- In Cursor: `Cursor Settings > Models > Add Model`
- Model name: e.g., `llama-3.1-70b`
- Check `Override OpenAI Base URL`
- Enter the base URL:
  - Ollama: `http://localhost:11434/v1`
  - LM Studio: `http://localhost:1234/v1`
  - Together AI: `https://api.together.xyz/v1`
- Enter an API key if the provider requires one
- The model appears in the Chat/Composer model dropdown
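For a local endpoint, check that the server is actually listening before adding it to Cursor. For example, with Ollama (which exposes an OpenAI-compatible API on port 11434):

```shell
# Returns a JSON list of locally available models; "Connection refused"
# means Ollama is not running on that port.
curl -s http://localhost:11434/v1/models | head -c 300
```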
## What BYOK Covers (and What It Does Not)

```
BYOK key used:               Cursor model (always):
┌──────────────────────┐     ┌──────────────────────┐
│ Chat (Cmd+L)         │     │ Tab Completion       │
│ Composer (Cmd+I)     │     │ Apply from Chat      │
│ Agent Mode           │     │ (diff application)   │
│ Inline Edit (Cmd+K)  │     │                      │
└──────────────────────┘     └──────────────────────┘
```
Tab Completion and Apply always use Cursor's proprietary models. You cannot route these through your own API key.
## Cost Management

### Monitoring Usage
With BYOK, you pay the provider directly. Monitor costs at:
- OpenAI: platform.openai.com/usage
- Anthropic: console.anthropic.com/settings/billing
- Azure: Azure Cost Management portal
### Setting Spending Limits
Set monthly spending limits at the provider level:
- OpenAI: Settings > Limits > Set monthly budget
- Anthropic: Settings > Limits > Set spending limit
- Azure: Create budget alerts in Azure Cost Management
### Cost-Saving Strategies
- Use Auto mode: Cursor selects cheaper models for simple tasks
- Default to Sonnet/GPT-4o: Reserve Opus/o1 for hard problems
- Shorter context: Use `@Files` not `@Codebase` when you know the location
- Fewer round-trips: Write detailed prompts to reduce back-and-forth
### Approximate Token Costs (as of early 2026)
| Model | Input (per 1M tokens) | Output (per 1M tokens) |
|---|---|---|
| GPT-4o | $2.50 | $10.00 |
| GPT-4o-mini | $0.15 | $0.60 |
| Claude Sonnet | $3.00 | $15.00 |
| Claude Opus | $15.00 | $75.00 |
| o1 | $15.00 | $60.00 |
A typical Composer session generating multi-file code uses 5K-20K tokens.
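The per-session cost follows directly from the table. A sketch for an assumed session of 15K input and 5K output tokens on Claude Sonnet (the token counts are illustrative):

```shell
# cost = input_tokens / 1M * input_rate + output_tokens / 1M * output_rate
awk 'BEGIN {
  in_tok  = 15000; out_tok  = 5000    # assumed session size
  in_rate = 3.00;  out_rate = 15.00   # Claude Sonnet, USD per 1M tokens
  printf "$%.3f per session\n", in_tok/1e6*in_rate + out_tok/1e6*out_rate
}'
# prints: $0.120 per session
```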
## Security Best Practices

### Key Storage
Cursor stores API keys locally in its settings database:
- macOS: `~/Library/Application Support/Cursor/`
- Linux: `~/.config/Cursor/`
- Windows: `%APPDATA%\Cursor\`
Keys are stored in the local Cursor configuration, not in project files. They do not sync between machines.
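Since the keys live in a per-user config directory, it is reasonable to restrict that directory to your own user. A Linux/macOS sketch (which file inside the directory holds the keys is a Cursor internal, so this locks down the whole config directory):

```shell
#!/bin/sh
# Remove group/other access from the Cursor config directory, if present.
for d in "$HOME/.config/Cursor" "$HOME/Library/Application Support/Cursor"; do
  if [ -d "$d" ]; then
    chmod -R go-rwx "$d"   # owner keeps access; group/other lose it
  fi
done
```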
### Rotation

- Generate a new key at the provider
- Update the key in `Cursor Settings > Models`
- Revoke the old key at the provider
- Verify Chat/Composer work with the new key
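The steps above can be scripted so the old key is only revoked once the new one is confirmed working. A sketch for an OpenAI key (`$NEW_KEY` is assumed to hold the replacement):

```shell
# Probe the new key; curl -w prints the HTTP status, "000" if unreachable.
status=$(curl -s -o /dev/null -w "%{http_code}" \
  https://api.openai.com/v1/models \
  -H "Authorization: Bearer $NEW_KEY" || true)

if [ "$status" = "200" ]; then
  echo "new key works -- safe to revoke the old one at the provider"
else
  echo "new key failed (HTTP $status) -- keep the old key active"
fi
```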
### Team Key Management
For teams using BYOK:
- Individual keys: Each developer uses their own key. Simplest setup, hardest to audit.
- Shared project key: Create a project-scoped key at the provider. Share via secure channel. Track usage per project.
- Azure gateway: Route all requests through a central Azure OpenAI deployment. Full audit logging, spending controls, model governance.
## Troubleshooting
| Error | Cause | Fix |
|---|---|---|
| 401 Unauthorized | Invalid or expired API key | Regenerate key at provider |
| 429 Rate Limited | Too many requests | Wait, or upgrade provider plan |
| 403 Forbidden | Key lacks model access | Enable model access at provider |
| Model not appearing | Key not saved or wrong provider | Re-enter key in Cursor Settings |
| Azure connection refused | Wrong endpoint or API version | Verify endpoint URL and version |
| Slow responses with BYOK | Provider rate limits apply | Check provider dashboard |
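When it is unclear whether the problem is the key or Cursor's configuration, take Cursor out of the loop and read the raw status code from the provider (OpenAI shown; the codes map onto the table above):

```shell
# 200 = key is fine (the problem is on the Cursor side), 401 = bad key,
# 403 = key lacks access to the model, 429 = rate limited.
curl -s -o /dev/null -w "%{http_code}\n" \
  https://api.openai.com/v1/models \
  -H "Authorization: Bearer $OPENAI_API_KEY" || echo "network error"
```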
## Enterprise Considerations
- Compliance: BYOK routes requests directly to the provider, bypassing Cursor's infrastructure. Verify this meets your data governance requirements.
- Azure private endpoints: Enterprise Azure deployments can use private endpoints for network-level isolation
- Key rotation policy: Implement quarterly key rotation as standard practice
- SSO + BYOK: Team admins can configure shared BYOK keys via the admin dashboard (Enterprise plan)