
How to Install MCP Servers in Cursor

Harsh Desai

13 min read

In this tutorial, you will install and configure Model Context Protocol (MCP) servers in Cursor, enabling the AI agent to access external tools such as databases, Git repositories, APIs, file systems, and cloud services directly within your coding workflow. You will set up both local stdio and remote HTTP servers, test them with real prompts like querying production Postgres schemas or fetching GitHub PR diffs, and troubleshoot common issues. Follow these steps to bridge Cursor with your tech stack in under 10 minutes, creating an AI that pulls live data without browser switches or manual copies.

TL;DR

MCP servers connect Cursor's AI to external development tools, allowing seamless interactions like querying live databases, pulling GitHub issues, inspecting Docker containers, or reading Slack threads without leaving the IDE. Setup takes about 10 minutes for your first server, with subsequent ones faster via copy-paste configs. You need Cursor v0.40 or later, basic command-line skills, Node.js (v20+) for many packages, and optionally Docker (v24+) or ngrok for HTTP exposure. The result is an AI that understands your full environment natively, reducing context switches, minimizing errors in agent-driven coding, and accelerating tasks from debugging to documentation lookups. Start with a local stdio server for quick wins like a Postgres MCP server, then scale to remote HTTP for teams using services like Supabase or Vercel deployments. Verify each step with Cursor's built-in MCP logs for instant feedback.

What You'll Build

This guide walks you through a complete MCP integration in Cursor. By the end, your Cursor instance will run a functional MCP server that exposes tools to the AI agent. For example, connect a Postgres MCP server to let the agent read schemas, run SELECT queries on your production database, or analyze query performance metrics. Or integrate a GitHub MCP server to fetch issues, PRs, repo metadata, commit histories, and even generate release notes during code reviews.

Imagine debugging a live issue: prompt the AI with "Check the latest errors in our prod Postgres logs from the last hour," and it queries directly via the MCP tool, returns formatted results inline with row counts and timestamps, and suggests optimized SQL fixes based on execution plans. For frontend devs, add a Vercel MCP to deploy previews or check deployment statuses mid-session. This eliminates manual tab-switching between IDE, browser, CLI, and dashboards. We'll cover both local stdio servers (ideal for solo devs with tools like local Redis or file watchers) and remote HTTP servers (for shared services like team GitLab instances or AWS RDS endpoints). Test with real prompts in Cursor's chat (Cmd/Ctrl + L) or Composer (Cmd/Ctrl + I) mode. The setup persists across sessions and projects, with tools available globally. Advanced users can chain tools: "Fetch GitHub issue #123, query related Postgres rows, summarize in Markdown." Expect 5-10x faster iterations once fluent.

What You'll Need

Prepare these tools before starting. All are free and widely available. Verify each with provided commands to avoid mid-setup blocks.

  • Cursor IDE (v0.40+): Download the latest from cursor.com. Verify version in Help > About Cursor. Older versions lack full MCP support or have buggy handshakes. Update via Help > Check for Updates if needed.
  • Terminal: Use macOS Terminal, Linux bash/zsh, or Windows PowerShell/WSL. Test with echo $PATH (should show /usr/local/bin, /usr/bin) and pwd to confirm home access. WSL users: ensure wsl --install if missing.
  • Node.js/NPM (v20+): Install from nodejs.org (LTS recommended). Run node --version (expect 20.x.x) and npm --version (10.x+). Most MCP servers distribute via npm packages like @modelcontextprotocol/server-github or @mcp/server-postgres. Test: npx --yes cowsay "Node ready".
  • Docker (optional, v24+): For containerized servers like databases or Python runtimes. Get it from docker.com. Run docker --version and docker run hello-world to verify. Useful for air-gapped Postgres: docker pull postgres:16. Allocate 4GB RAM in settings for smooth runs.
  • Optional extras: Git (git --version), ngrok (ngrok authtoken your_token for HTTP tunnels), API keys (GitHub: github.com/settings/tokens with scopes like repo, read:org; Postgres: connection string with SSL mode), and jq (brew install jq or equivalent) for JSON validation.

No advanced permissions required. Total setup time: 5 minutes if prerequisites are met. Budget 2 minutes per verification command.
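To script these checks, here is a minimal POSIX-shell sketch; the tool names and minimum major versions mirror the list above and are easy to adjust:

```shell
# Sketch: gate setup on the prerequisite versions listed above.
# ver_ge tests that a version string's major component meets a minimum.
ver_ge() {
  req="$1"
  ver="${2#v}"            # node prints "v20.x.x"; strip the leading "v"
  major="${ver%%.*}"      # compare only the major component
  [ "$major" -ge "$req" ]
}

for pair in "node 20" "npm 10"; do
  tool="${pair% *}"; min="${pair#* }"
  if command -v "$tool" >/dev/null 2>&1; then
    v="$("$tool" --version)"
    ver_ge "$min" "$v" && echo "$tool OK ($v)" || echo "$tool too old ($v, need $min+)"
  else
    echo "$tool missing (need v$min+)"
  fi
done
```

Run it once before Step 1; anything reported as missing or too old will block a later step.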

Step 1: Locate your Cursor MCP config file

Cursor centralizes MCP configurations in a single JSON file, mcp.json. This file lists all servers, their transports, commands, and tools. Changes here trigger full reloads on IDE restart.

macOS/Linux: Navigate to ~/.cursor/mcp.json. Use ls -la ~/.cursor/ in terminal to list. Common variants: ~/.config/cursor/mcp.json on some Linux distros. If missing, create the directory with mkdir -p ~/.cursor and touch the file: touch ~/.cursor/mcp.json. Open with open ~/.cursor/mcp.json or cursor ~/.cursor/mcp.json.

Windows: Path is %USERPROFILE%\.cursor\mcp.json (e.g., C:\Users\YourName\.cursor\mcp.json). In PowerShell: Test-Path $env:USERPROFILE\.cursor\mcp.json; if false, New-Item -Path "$env:USERPROFILE\.cursor" -ItemType Directory -Force; New-Item -Path "$env:USERPROFILE\.cursor\mcp.json" -ItemType File. Open: cursor $env:USERPROFILE\.cursor\mcp.json.

Open in Cursor or VS Code: code ~/.cursor/mcp.json. Initial content should be an empty object:

{
  "mcpServers": {}
}

Validate syntax early with a JSON linter like jq . ~/.cursor/mcp.json (install jq via brew install jq, apt install jq, or choco install jq). Errors show as "parse error." Backup original: cp ~/.cursor/mcp.json ~/.cursor/mcp.json.bak. Pro tip: Use Cursor's JSON language server for real-time linting.
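If jq is not installed, Python's built-in json.tool module works as a linter. A minimal sketch, assuming the default macOS/Linux config path from above:

```shell
# Sketch: lint mcp.json without jq, using Python's stdlib json.tool module.
CFG="${HOME}/.cursor/mcp.json"

lint_json() {
  # Exit 0 on valid JSON; parse errors (with line/column) go to stderr.
  python3 -m json.tool "$1" >/dev/null
}

if [ -f "$CFG" ]; then
  lint_json "$CFG" && echo "mcp.json OK" || echo "mcp.json invalid"
else
  echo "no config yet at $CFG"
fi
```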

Step 2: Choose your MCP server transport type

Cursor communicates with MCP servers via two transports. Select based on your use case, latency needs, and deployment scale. Here's a comparison:

| Feature  | Stdio                | Streamable HTTP    |
|----------|----------------------|--------------------|
| Latency  | Sub-ms (local pipes) | 10-100ms (network) |
| Setup    | 1 line in JSON       | Expose URL/port    |
| Sharing  | Local only           | Teams/remote       |
| Security | Process isolation    | Headers/OAuth      |
| Use case | Local DB/Git         | Cloud/SaaS         |

  1. Stdio (Standard I/O): Cursor launches the server as a subprocess, piping data via stdin/stdout. Pros: Zero network latency, simple for local tools like GitHub MCP, Postgres on localhost:5432, or file system watchers. Cons: Tied to one machine; restarts kill processes. Environment vars pass securely.

  2. Streamable HTTP: Cursor connects to a URL (localhost:8000 or https://mcp.yourdomain.com). Pros: Scalable for teams, works behind proxies/VPNs, hot-reloadable. Cons: Requires server exposure (use ngrok: ngrok http 8000), firewall rules. Ideal for Dockerized databases, AWS Lambda gateways, or SaaS like Natoma.

Decision tree:

  • Local dev tool (e.g., npm dev server)? Stdio.
  • Team/shared (e.g., prod Redis cluster)? HTTP.
  • Hybrid: Stdio for dev, HTTP for staging. Example: Stdio for Postgres on localhost; HTTP for a cloud Redis instance via https://redis.example.com/mcp.

Preview configs in Steps 3-4. Test transport compatibility: Start simple stdio first.
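As a quick preview, one mcp.json can mix both transports side by side; the server names, package, and URL below are placeholders:

```json
{
  "mcpServers": {
    "local-postgres": {
      "type": "stdio",
      "command": "npx",
      "args": ["-y", "@mcp/server-postgres", "--connection-string", "postgresql://user:pass@localhost:5432/myapp"]
    },
    "team-redis": {
      "type": "streamable-http",
      "url": "https://mcp.example.com/mcp",
      "headers": { "Authorization": "Bearer <token>" }
    }
  }
}
```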

Step 3: Configure a local Stdio MCP server

Stdio servers shine for quick, local integrations with zero infrastructure. We'll use a GitHub MCP server as an example, exposing tools like list_repos, get_issue, search_code, create_pr, and get_pull_request_diff.

  1. Generate token: Go to github.com/settings/tokens > Fine-grained > Select repos > Generate.
  2. Edit mcp.json and add under mcpServers:
{
  "mcpServers": {
    "github": {
      "type": "stdio",
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-github"],
      "env": {
        "GITHUB_TOKEN": "ghp_YourToken123abc"
      }
    }
  }
}

Breakdown:

  • type: "stdio" (required).
  • command: Runtime executable (npx auto-installs the package on first run).
  • args: Package name and flags (-y skips npm's install prompt; add --verbose for logs).
  • env: Keep secrets here, not in args; args appear in process lists and shell history.
  3. Save (Cmd/Ctrl + S). Restart Cursor fully.
  4. Test: AI chat > "@github List open issues in owner/repo." Response: JSON list with titles, assignees, labels.
  5. Advanced args: ["-y", "@modelcontextprotocol/server-github", "--org", "your-org"] for org tools.

Postgres example (local Docker DB):

{
  "mcpServers": {
    "postgres": {
      "type": "stdio",
      "command": "npx",
      "args": ["-y", "@mcp/server-postgres", "--connection-string", "postgresql://user:pass@localhost:5432/myapp?sslmode=disable"],
      "env": {}
    }
  }
}

Tools: get_schema, run_query, explain_query. Prompt: "@postgres run_query SELECT * FROM users LIMIT 5."

Docker variant: "command": "docker", "args": ["run", "-i", "--rm", "--network=host", "mcp/postgres:latest", "--dsn", "postgresql://..."]. Pull image first.

File system example: npx @mcp/server-filesystem --root /project/docs for doc fetching.

Tools auto-register on handshake. Monitor spawn with ps aux | grep npx.
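"Auto-register on handshake" refers to the JSON-RPC 2.0 exchange Cursor runs at startup: it sends an initialize request, the server replies with its capabilities, and the client then calls tools/list to discover names like get_issue and run_query. The first message looks roughly like this (the protocolVersion value varies by release and is illustrative):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "initialize",
  "params": {
    "protocolVersion": "2024-11-05",
    "capabilities": {},
    "clientInfo": { "name": "cursor", "version": "0.40.0" }
  }
}
```

If a server spawns but registers no tools, this handshake is the first thing to inspect in the MCP logs.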

Step 4: Configure a remote HTTP MCP server

HTTP suits deployed services or when stdio limits hit. First, run/expose the server.

Local HTTP example (dev server):

  1. Terminal: npx @mcp/server-redis --port 8000 --redis-url redis://localhost:6379.
  2. Config in mcp.json:
{
  "mcpServers": {
    "redis": {
      "type": "streamable-http",
      "url": "http://localhost:8000/mcp"
    }
  }
}

Tools: keys, get, scan. Test: "@redis keys 'user:*'".

Remote with auth (e.g., Natoma or self-hosted gateway):

  1. Deploy server (Vercel: vercel --prod; or Docker: docker run -p 8000:8000 mcp/redis --redis-url prod-url).
  2. Tunnel if local: ngrok http 8000 > copy https://abc.ngrok.io.
  3. Config:
{
  "mcpServers": {
    "prod-db": {
      "type": "streamable-http",
      "url": "https://your-gateway.example.com/mcp",
      "headers": {
        "Authorization": "Bearer eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9...",
        "X-API-Key": "sk-123"
      }
    }
  }
}

Headers tips: Use Bearer for JWTs; add {"Content-Type": "application/json"} only if your gateway requires it. Bearer tokens are sent as-is, with no Base64 encoding needed. Always secure remote endpoints with HTTPS/TLS.

Proxy setup: Nginx conf:

server {
  listen 443 ssl http2;
  server_name mcp.yourdomain.com;
  ssl_certificate /path/to/cert.pem;
  location /mcp {
    proxy_pass http://localhost:8000/mcp;
    proxy_http_version 1.1;
    proxy_set_header Upgrade $http_upgrade;
  }
}

Reload: nginx -s reload. Multiple servers? Add entries; Cursor merges namespaces (e.g., @redis.keys vs @postgres.keys).

Step 5: Validate and restart Cursor

  1. Save mcp.json (lint first).
  2. Quit Cursor fully: Cmd/Ctrl + Q; macOS Activity Monitor > Quit all Cursor processes; Windows Task Manager > End task.
  3. Relaunch Cursor.
  4. Validate:
    • Command Palette (Cmd/Ctrl + Shift + P) > "Cursor: List MCP Tools" > See github.list_repos, postgres.get_schema.
    • AI chat: "@github get_repo_info owner/repo" > Expect stars, forks, languages.
  5. Full test workflow:
    • Composer mode: Select code > "Refactor using latest GitHub PR #45 diff and prod schema."
    • Chain: "List Postgres tables, then search GitHub for matching issues."
  6. Logs: Palette > "Developer: Show Logs" > MCP tab for handshakes (look for "Tool registered: get_issue").

Success indicators: Tools in autocomplete dropdowns, zero errors in logs. Pro tip: Use Cursor Composer for multi-file edits with MCP context, like auto-generating tests from DB schemas.

Common Mistakes

Avoid these pitfalls to save hours. Each includes fix commands.

  • JSON syntax errors: Missing commas or unescaped quotes invalidate the file. Fix: Paste into jsonlint.com, correct, save. Cursor ignores a malformed config with no warning beyond an "Invalid config" line in the logs.
  • Incorrect PATH: npx/docker not found (error: ENOENT). macOS: export PATH="/usr/local/bin:$PATH" in ~/.zshrc, source ~/.zshrc. Windows: $env:PATH += ";C:\Program Files\Docker" in PowerShell profile. Restart terminal/Cursor.
  • Leaked credentials: Hardcoded tokens in JSON/args (git diff exposes). Always env block; OS vars: export GITHUB_TOKEN=ghp_xxx in ~/.bashrc.
  • Port conflicts: HTTP bind fails. macOS: lsof -ti :8000 | xargs kill -9. Windows: netstat -ano | findstr :8000, then taskkill /PID <pid> /F.
  • Version mismatches: Cursor v0.39 and earlier lack MCP; Node 18 breaks many npx packages. Update Cursor (Help menu) and Node (nvm use 20).
  • Firewall blocks: Remote HTTP 403/timeout. macOS: System Settings > Network > Firewall > Options > Allow Cursor. Windows: Firewall > Allow app.
  • Missing scopes: GitHub token lacks repo:read > 401 auth fail. Regenerate with full scopes.
  • Resource limits: Loading 100+ tools bloats the agent's context and can exhaust memory. Prune unused servers from mcp.json.
  • WSL quirks: Paths /mnt/c/... confuse. Use Linux paths, install Node/Docker in WSL.

Pre-check with jq (jq . ~/.cursor/mcp.json) before restarting; Cursor does not yet ship a dedicated config-validation command.
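A small pre-flight script can also catch the structural mistakes above before a restart. This sketch assumes the transport keys shown in Steps 3-4 (stdio entries need command, streamable-http entries need url) and uses only the Python stdlib:

```shell
# Sketch: verify each server entry has the keys its transport requires.
CFG="${HOME}/.cursor/mcp.json"

check_cfg() {
python3 - "$1" <<'PY'
import json, sys

cfg = json.load(open(sys.argv[1]))
ok = True
for name, server in cfg.get("mcpServers", {}).items():
    t = server.get("type")
    if t == "stdio" and "command" not in server:
        print(f"{name}: stdio server missing 'command'"); ok = False
    elif t == "streamable-http" and "url" not in server:
        print(f"{name}: http server missing 'url'"); ok = False
sys.exit(0 if ok else 1)
PY
}

if [ -f "$CFG" ]; then
  check_cfg "$CFG" && echo "config OK"
fi
```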

Frequently Asked Questions

What happens if I exceed Cursor's tool limit?

Cursor limits active tools to about 40 across servers to preserve LLM context quality with models like GPT-4o or Claude 4.6. Exceeding causes degraded performance, ignored tools, or token overflows in responses. Disable via Cursor Settings > Features > MCP > Manage Tools; prioritize essentials like database queries and Git ops. Monitor usage in logs. This keeps agent responses accurate, fast, and focused during complex, multi-tool sessions involving chain calls.

Can I run multiple MCP servers at once?

Yes, declare multiple entries in mcp.json under mcpServers object, like "github": {...}, "postgres": {...}. Cursor loads them in parallel during startup, aggregates tools into a single namespace with prefixes (e.g., @github.get_issue, @postgres.run_query) to avoid conflicts. Resource usage scales linearly; cap at 10 servers. Monitor CPU/memory via Activity Monitor. Ideal for full-stack: DB + Git + Deploy tools together.

Is stdio safe for team environments?

Stdio itself is well isolated: the server runs as a local subprocess per Cursor instance, bound to your user session and unreachable remotely. But it exposes local resources like files and DBs directly and does not scale to teams. For teams, deploy streamable HTTP servers to a VPC, Kubernetes, or gateway (e.g., AWS ALB) with RBAC, OAuth2, and rate limits. Centralize auth via an IdP like Okta. Audit logs track calls. This prevents credential sprawl and enables compliance.

Does restarting Cursor clear MCP cache?

Yes, full restart reloads mcp.json from disk, terminates all child processes (stdio) or reconnects sockets (HTTP), and performs fresh JSON-RPC handshakes. No persistent cache or state exists between launches. Update server code? Rebuild npm/Docker image, edit args if pinned version, restart IDE. Use for hot-reloads in dev. Avoid in long sessions; test changes in new window first.

Where can I find pre-built MCP servers?

Search npm for "@mcp/server-" or "@modelcontextprotocol/server-" (e.g., @mcp/server-github v1.2.3, @mcp/server-slack). GitHub org modelcontextprotocol/servers has 50+ repos with Dockerfiles. Official from Vercel (server-vercel), Supabase (server-supabase). Community hubs: awesome-mcp list. Clone: git clone https://github.com/modelcontextprotocol/servers, npm install, customize flags. Ecosystem grows with Claude 4.6 tool-calling and Grok 4.1 JSON parsing.

Can I control which files an MCP server sees?

Yes, most servers define scopes via config: --root-dir /safe/project for filesystem, --db-names myapp,staging for Postgres. Cursor adds approval prompts for destructive ops (e.g., DELETE). Review server source/manifest.json for exposed paths. Run in sandbox (Docker --read-only). Disable per-tool in Settings > MCP. Trust vetted packages; fork/audit open-source ones before prod use.

How do I debug connection failures?

Open Command Palette (Cmd/Ctrl + Shift + P), run "Developer: Show Logs," select "MCP Logs" tab. View real-time JSON-RPC: handshakes, "method: initialize," errors like "ENOENT: command not found" or "401 Unauthorized." Tail file: tail -f ~/.cursor/logs/mcp.log. Fixes: auth (refresh token), timeouts (increase --timeout flag), ports (netstat). Reproduce in minimal config.

Are MCP servers officially supported on Linux?

Yes, Cursor fully supports MCP on Linux (Ubuntu 22.04+, Fedora 40+, Arch) via AppImage, Flatpak, or Snap installs. Config path: ~/.cursor/mcp.json or ~/.config/Cursor/mcp.json (check $XDG_CONFIG_HOME). Ensure node, npx, and docker are on your PATH. Tested under WSL2/Debian. Install Node via your package manager: sudo apt install nodejs npm. Report bugs on Cursor's GitHub issues with the mcp label. The JSON schema is identical across platforms.

What models work best with MCP tools?

GPT-4o excels at parallel tool calls, Claude 4.6 at reasoning over schemas, Grok 4.1 at code-gen from Git diffs, Gemini 2.5 at large contexts (128k+ for 40 tools). Select in Cursor Settings > AI Models. Test: "@tool complex_query" iteratively. Fine-tune prompts: "Use @postgres first, then @github." Larger models handle chains without hallucination.

How do I update an MCP server package?

Edit mcp.json: change args to "@mcp/server-github@2.0.0" or "@latest" (npx fetches). For Docker: "image: mcp/postgres:1.1" > docker pull mcp/postgres:1.1. Restart Cursor. Check changelogs: npm view @mcp/server-github versions --json. Breaking changes: tool sigs (args renamed). Rollback: pin old version. Automate with scripts.

Can MCP servers access private repos?

Yes, pass GITHUB_TOKEN with private-repo scope in env/headers. For GitLab: @mcp/server-gitlab --token glpat-xxx. Self-hosted Git: --url ssh://git@yourserver/repo. Test: "@github list_repos" includes private. Revoke tokens post-test. Enterprise: VPN + custom server fork.

