Axiom Refract + Cursor
Connect Cursor's AI to your architectural record for structurally aware code assistance
How It Works
Open MCP Settings
In Cursor, navigate to Settings > MCP Servers (or edit your .cursor/mcp.json configuration file directly).
Add Axiom Server
Add the Axiom Refract MCP server endpoint using the Streamable HTTP configuration. Save the configuration.
Restart Cursor
Restart Cursor to initialize the MCP connection. The Axiom tools will appear in Cursor's tool list.
Use in Chat
Ask Cursor's AI assistant about your codebase architecture. The assistant will use Axiom's tools to provide answers grounded in structural analysis data.
What You Can Do
Contextual Architecture Queries
Ask Cursor about the architecture while editing code. "What is the blast radius of this file?" "Is this a single point of failure?" Cursor queries Axiom and provides answers inline.
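"Blast radius" here means the set of files that transitively depend on the one you are editing. Axiom computes this from your real codebase; as a concept sketch only (toy data, not Axiom's implementation), it amounts to walking the import graph's reverse edges:

```python
from collections import deque

# Toy import graph: edges point from a file to the files it imports.
# (Illustrative data only -- Axiom derives the real graph from your code.)
imports = {
    "app.py": ["auth.py", "db.py"],
    "auth.py": ["db.py"],
    "api.py": ["auth.py"],
    "db.py": [],
}

def blast_radius(target, graph):
    """Files that transitively depend on `target`, i.e. everything
    that could break if `target` changes, found via reverse edges."""
    affected = {f for f, deps in graph.items() if target in deps}
    queue = deque(affected)
    while queue:
        f = queue.popleft()
        for g, deps in graph.items():
            if f in deps and g not in affected:
                affected.add(g)
                queue.append(g)
    return affected

print(sorted(blast_radius("db.py", imports)))  # ['api.py', 'app.py', 'auth.py']
```

Note that `api.py` never imports `db.py` directly; it is in the blast radius only through `auth.py`, which is exactly the kind of indirect impact that is easy to miss without structural data.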
Refactoring Guidance
Before refactoring, Cursor can query Axiom's migration plan and suggest the highest-impact improvements based on structural data.
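The intuition behind "highest-impact improvements" can be sketched with toy metrics (the field names and scores below are hypothetical; Axiom's migration plan supplies the real structural data):

```python
# Hypothetical structural metrics per file -- illustrative values only.
files = [
    {"path": "db.py",   "dependents": 14, "cycles": 2},
    {"path": "util.py", "dependents": 30, "cycles": 0},
    {"path": "api.py",  "dependents": 3,  "cycles": 1},
]

def rank_refactors(metrics):
    """Order candidates so the highest-leverage refactors come first:
    more dependents and more cycle involvement mean more impact."""
    return sorted(metrics, key=lambda f: (f["dependents"], f["cycles"]), reverse=True)

for f in rank_refactors(files):
    print(f["path"], f["dependents"], f["cycles"])
```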
Dependency Exploration
Cursor can traverse the dependency graph through Axiom, showing you what depends on the file you are editing and what it depends on.
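The two directions of that traversal, what a file imports versus what imports it, can be illustrated with the same kind of toy graph (a sketch of the concept, not Axiom's API):

```python
# Toy import graph (illustrative only; Axiom builds the real one).
imports = {
    "app.py": ["auth.py", "db.py"],
    "auth.py": ["db.py"],
    "api.py": ["auth.py"],
    "db.py": [],
}

def neighbors(target, graph):
    """One-hop view of the dependency graph around `target`:
    what it depends on, and which files depend on it."""
    depends_on = graph.get(target, [])
    depended_on_by = [f for f, deps in graph.items() if target in deps]
    return {"depends_on": depends_on, "depended_on_by": depended_on_by}

print(neighbors("auth.py", imports))
```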
Dead Code Identification
While reviewing code, Cursor can check whether a function or file is dead code — never called from any active path in the codebase.
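Dead-code detection of this kind is a reachability question: anything not reachable from an entry point along the call graph is a candidate. A minimal sketch with a hypothetical call graph (Axiom extracts the real one by analysis):

```python
# Toy call graph and entry points -- illustrative data only.
calls = {
    "main": ["handle_request"],
    "handle_request": ["render"],
    "render": [],
    "legacy_export": ["render"],  # calls into live code but is never called
}

def dead_functions(entry_points, graph):
    """Functions unreachable from any entry point: dead-code candidates."""
    live, stack = set(), list(entry_points)
    while stack:
        fn = stack.pop()
        if fn in live:
            continue
        live.add(fn)
        stack.extend(graph.get(fn, []))
    return set(graph) - live

print(dead_functions(["main"], calls))  # {'legacy_export'}
```

`legacy_export` calls live code, yet nothing on any active path calls it, so it is flagged; reachability has to start from entry points, not from "does anything it touches run".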
Setup
Open Cursor and navigate to Settings > MCP Servers. Click "Add Server" and enter the Axiom Refract endpoint URL. Alternatively, create or edit .cursor/mcp.json in your project root with the configuration shown below. Restart Cursor to apply the configuration. The 16 Axiom tools will be available to Cursor's AI assistant.
{
  "mcpServers": {
    "axiom-refract": {
      "type": "http",
      "url": "https://axiomrefract.com/mcp"
    }
  }
}
Why This Matters
Cursor is a popular AI-native code editor used by thousands of developers. Connecting it to Axiom Refract via MCP transforms Cursor from a code-aware assistant into an architecture-aware assistant — able to answer structural questions with verified data rather than inference.