> bgpt-paper-search
Search scientific papers and retrieve structured experimental data extracted from full-text studies via the BGPT MCP server. Returns 25+ fields per paper including methods, results, sample sizes, quality scores, and conclusions. Use for literature reviews, evidence synthesis, and finding experimental details not available in abstracts alone.
BGPT Paper Search
Overview
BGPT is a remote MCP server that searches a curated database of scientific papers built from raw experimental data extracted from full-text studies. Unlike traditional literature databases that return titles and abstracts, BGPT returns structured data from the actual paper content — methods, quantitative results, sample sizes, quality assessments, and 25+ metadata fields per paper.
When to Use This Skill
Use this skill when:
- Searching for scientific papers with specific experimental details
- Conducting systematic or scoping literature reviews
- Finding quantitative results, sample sizes, or effect sizes across studies
- Comparing methodologies used in different studies
- Looking for papers with quality scores or evidence grading
- Needing structured data from full-text papers (not just abstracts)
- Building evidence tables for meta-analyses or clinical guidelines
Setup
BGPT is a remote MCP server — no local installation required.
Claude Desktop / Claude Code
Add to your MCP configuration:
{
  "mcpServers": {
    "bgpt": {
      "command": "npx",
      "args": ["mcp-remote", "https://bgpt.pro/mcp/sse"]
    }
  }
}
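Some MCP clients can connect to remote SSE servers directly, without the mcp-remote stdio bridge. A minimal sketch of that alternative, assuming your client accepts a `url` field for remote servers (check your client's documentation — the exact key names vary between clients):

```json
{
  "mcpServers": {
    "bgpt": {
      "url": "https://bgpt.pro/mcp/sse"
    }
  }
}
```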
npm (alternative)
npx bgpt-mcp
Usage
Once configured, use the search_papers tool provided by the BGPT MCP server:
Search for papers about: "CRISPR gene editing efficiency in human cells"
The server returns structured results including:
- Title, authors, journal, year, DOI
- Methods: Experimental techniques, models, protocols
- Results: Key findings with quantitative data
- Sample sizes: Number of subjects/samples
- Quality scores: Study quality assessments
- Conclusions: Author conclusions and implications
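The structured fields above map naturally onto evidence-table rows for reviews and meta-analyses. A minimal sketch of that post-processing step, assuming hypothetical field names such as `sample_size` and `quality_score` for the returned records (the actual response schema may differ):

```python
# Sketch: shaping structured paper records into evidence-table rows.
# Field names here (title, year, sample_size, quality_score, results)
# are assumptions about the response shape, not a documented schema.

def to_evidence_rows(papers):
    """Flatten paper records into rows, highest-quality evidence first."""
    rows = []
    for p in papers:
        rows.append({
            "citation": f'{p.get("title", "?")} ({p.get("year", "n.d.")})',
            "n": p.get("sample_size"),
            "quality": p.get("quality_score"),
            "key_result": p.get("results"),
        })
    # Sort by quality score, descending, treating missing scores as 0.
    return sorted(rows, key=lambda r: r["quality"] or 0, reverse=True)
```

From here the rows can be written to CSV or handed to a literature-review workflow for synthesis.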
Pricing
- Free tier: 50 searches per network, no API key required
- Paid: $0.01 per result with an API key from bgpt.pro/mcp
Complementary Skills
Pairs well with:
- literature-review — Use BGPT to gather structured data, then synthesize with literature-review workflows
- pubmed-database — Use PubMed for broad searches, BGPT for deep experimental data
- biorxiv-database — Combine preprint discovery with full-text data extraction
- citation-management — Manage citations from BGPT search results