# Surfboard Payments Documentation MCP

A Model Context Protocol (MCP) server that provides AI assistants with direct access to Surfboard Payments' documentation.
## Overview
This MCP server acts as a bridge between AI assistants and Surfboard Payments' documentation, enabling AI systems to access current, accurate documentation on demand. This ensures developers receive precise guidance when integrating with Surfboard Payments APIs.
## Features

- Documentation Fetching: Retrieve raw documentation files (Markdown, JSON) from Surfboard's servers
- Documentation Index: Access the central documentation index (`llms.txt`)
- Basic Search: Find relevant documentation paths based on simple keyword queries
- Documentation Caching: Store documentation for 3-4 days, reducing network overhead and improving response times
- Multi-runtime Support: Run using Node.js or Bun
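The 3-4 day caching described above can be pictured as a simple TTL map. This is a hypothetical sketch, not the package's actual implementation: the function names (`cacheGet`, `cacheSet`) and the exact expiry handling are assumptions for illustration.

```typescript
// Hypothetical sketch of TTL-based documentation caching.
// The real @surfai/docs-mcp cache may be structured differently.

const THREE_DAYS_MS = 3 * 24 * 60 * 60 * 1000;

interface CacheEntry {
  body: string;      // cached documentation content
  fetchedAt: number; // epoch milliseconds at fetch time
}

const cache = new Map<string, CacheEntry>();

function cacheSet(path: string, body: string, now: number = Date.now()): void {
  cache.set(path, { body, fetchedAt: now });
}

function cacheGet(path: string, now: number = Date.now()): string | undefined {
  const entry = cache.get(path);
  if (!entry) return undefined;
  if (now - entry.fetchedAt > THREE_DAYS_MS) {
    // Entry is older than the TTL: evict it so the next call refetches.
    cache.delete(path);
    return undefined;
  }
  return entry.body;
}
```

Passing `bypass_cache: true` to a tool would correspond to skipping `cacheGet` and fetching fresh content directly.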
## Installation

### Local Installation

```bash
npm install @surfai/docs-mcp
```

### Global Installation

```bash
npm install -g @surfai/docs-mcp
```
## Usage

### When Installed Locally

Running with Node.js:

```bash
npm run start:node
```

Running with Bun:

```bash
npm run start:bun
```

### When Installed Globally

```bash
node-surfboard-documentation-ai
```

or:

```bash
bun-surfboard-documentation-ai
```
## Available Tools

The MCP server provides the following tools:
### 1. surfboard_fetch_documentation_index

Fetches the Surfboard documentation index.

Parameters:

- `bypass_cache` (optional): Whether to bypass the cache (default: `false`)

Example:

```json
{
  "name": "surfboard_fetch_documentation_index",
  "arguments": {
    "bypass_cache": false
  }
}
```
### 2. surfboard_fetch_documentation

Fetches a specific Surfboard documentation file.

Parameters:

- `path` (required): Path to the documentation file
- `bypass_cache` (optional): Whether to bypass the cache (default: `false`)

Example:

```json
{
  "name": "surfboard_fetch_documentation",
  "arguments": {
    "path": "/api/orders.json",
    "bypass_cache": false
  }
}
```
### 3. surfboard_search_documentation

Searches Surfboard documentation.

Parameters:

- `query` (required): Search query
- `max_results` (optional): Maximum number of results to return (default: `10`)
- `include_snippets` (optional): Whether to include snippets in the results (default: `true`)
- `bypass_cache` (optional): Whether to bypass the cache (default: `false`)

Example:

```json
{
  "name": "surfboard_search_documentation",
  "arguments": {
    "query": "process payment",
    "max_results": 5,
    "include_snippets": true,
    "bypass_cache": false
  }
}
```
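The basic keyword search described in Features can be pictured as matching query terms against paths from the documentation index. The sketch below is an assumption for illustration: the package's actual matching and ranking logic is not documented here, and `searchIndex` is a hypothetical name.

```typescript
// Hypothetical sketch: rank documentation paths by how many query
// terms they contain. Not the package's actual search implementation.

interface SearchResult {
  path: string;
  score: number; // number of query terms found in the path
}

function searchIndex(index: string[], query: string, maxResults = 10): SearchResult[] {
  const terms = query.toLowerCase().split(/\s+/).filter(Boolean);
  return index
    .map((path) => ({
      path,
      score: terms.filter((t) => path.toLowerCase().includes(t)).length,
    }))
    .filter((r) => r.score > 0)        // drop non-matches
    .sort((a, b) => b.score - a.score) // best matches first
    .slice(0, maxResults);             // honor max_results
}
```

The `max_results` parameter maps onto the final `slice`; a real implementation would likely also search file contents to produce the snippets that `include_snippets` controls.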
### 4. surfboard_clear_cache

Clears the documentation cache.

Parameters:

- `path` (optional): Path to clear from cache (if not provided, clears the entire cache)

Example:

```json
{
  "name": "surfboard_clear_cache",
  "arguments": {
    "path": "/api/orders.json"
  }
}
```
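The path-or-everything semantics of `surfboard_clear_cache` reduce to a small helper; a hypothetical sketch (the `clearCache` name and `Map`-backed cache are assumptions):

```typescript
// Hypothetical sketch of the clear-cache semantics: delete one cached
// path if given, otherwise clear the whole cache.
function clearCache(cache: Map<string, string>, path?: string): void {
  if (path === undefined) {
    cache.clear();       // no path: wipe everything
  } else {
    cache.delete(path);  // path given: evict just that entry
  }
}
```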
## MCP Integration

This MCP server is designed to be used with AI assistants that support the Model Context Protocol. It provides tools for accessing Surfboard Payments documentation directly from the AI assistant.
## Configuration

### When Installed Locally

When registering this MCP server with your AI assistant platform using a local installation:

```json
{
  "command": "node",
  "args": ["node_modules/@surfai/docs-mcp/node.index.js"],
  "env": {}
}
```

Or for Bun:

```json
{
  "command": "bun",
  "args": ["node_modules/@surfai/docs-mcp/bun.index.js"],
  "env": {}
}
```
### When Installed Globally

When registering this MCP server with your AI assistant platform using a global installation:

```json
{
  "command": "node-surfboard-documentation-ai",
  "args": [],
  "env": {},
  "autoApprove": []
}
```

Or for Bun:

```json
{
  "command": "bun-surfboard-documentation-ai",
  "args": [],
  "env": {},
  "autoApprove": []
}
```
If you're having trouble with the commands above, find your global `node_modules` directory:

```bash
npm root -g
# e.g. /Users/{{USER}}/.nvm/versions/node/{{VERSION}}/lib/node_modules
```

Then register the MCP server using the absolute path:

```json
{
  "command": "node",
  "args": [
    "/Users/{{USER}}/.nvm/versions/node/{{VERSION}}/lib/node_modules/@surfai/docs-mcp/node.index.js"
  ],
  "env": {},
  "disabled": false,
  "autoApprove": []
}
```
## License

MIT