Wire Neruva into the tool you already use.
One MCP server. Every modern coding agent. Memory and Substrate, behind a single API key. Pinecone-compatible REST when you need it.
claude mcp add neruva \
  --env NERUVA_API_KEY=$NERUVA_API_KEY \
  -- npx -y @neruva/mcp

Claude Code docs →
# ~/.cursor/mcp.json
{
"mcpServers": {
"neruva": {
"command": "npx",
"args": ["-y", "@neruva/mcp"],
"env": { "NERUVA_API_KEY": "..." }
}
}
}

Cursor docs →

# ~/.codex/config.toml
[mcp_servers.neruva]
command = "npx"
args = ["-y", "@neruva/mcp"]
env = { NERUVA_API_KEY = "..." }

Codex docs →

# ~/.gemini/settings.json
{
"mcpServers": {
"neruva": {
"command": "npx",
"args": ["-y", "@neruva/mcp"],
"env": { "NERUVA_API_KEY": "..." }
}
}
}

Gemini CLI docs →

# stdio transport, JSON-RPC 2.0
npx -y @neruva/mcp

# 28 tools exposed (4 layers):
# Memory  -- create_index, list_indexes, describe_index, stats,
#            embed, upsert, upsert_text, query, query_text, fetch,
#            update, replace_text, delete, forget_where,
#            export, import, bind_role, read_roles
# KG      -- kg.add_fact, kg.query, kg.delete_fact,
#            kg.export, kg.import
# Causal  -- causal.add_worlds, causal.query,
#            causal.export, causal.import
# Analogy -- analogy
# Bonus   -- query/query_text accept tags=[...] for AND-filtering

Any MCP host docs →
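The server speaks plain JSON-RPC 2.0 over stdio, so any MCP host can drive it. As a rough illustration (tool names come from the list above; the argument keys are assumptions, not a confirmed schema), the frame a host sends for a query_text call looks like:

```python
import json

# Minimal sketch of the JSON-RPC 2.0 framing an MCP host writes to the
# server's stdin. The "tools/call" method is standard MCP; the argument
# names (index, namespace, text, top_k, tags) are illustrative only.
def tool_call(call_id: int, name: str, arguments: dict) -> str:
    msg = {
        "jsonrpc": "2.0",
        "id": call_id,
        "method": "tools/call",
        "params": {"name": name, "arguments": arguments},
    }
    return json.dumps(msg)

# e.g. a semantic query with the AND tag-filter mentioned above
frame = tool_call(1, "query_text", {
    "index": "agents",        # hypothetical argument names
    "namespace": "coder",
    "text": "how did we fix the flaky test?",
    "top_k": 5,
    "tags": ["ci", "pytest"],  # AND-filter
})
```

In practice your MCP host builds these frames for you; the sketch only shows what travels over the pipe.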
pip install neruva-mcp
neruva-mcp                  # stdio MCP server
# or run inline: python -m neruva_mcp

Python (pip) docs →
The MCP server above lets your agent read from Records, KG, Causal, and Analogy. Auto-record is how the namespace fills up in the first place -- without you writing any logging code. There are two on-ramps, both fire-and-forget, and both write typed records (kind, tags, ts, meta) into the same namespace your agent queries.
pip install neruva-record anthropic
export NERUVA_API_KEY=nv_...
import anthropic
from neruva_record import auto_record
client = auto_record(
anthropic.Anthropic(),
namespace="main",
ttl_days=30, # optional auto-expiry
)
# Drop-in -- behaves identically to bare Anthropic.
client.messages.create(
model="claude-opus-4-7", max_tokens=200,
messages=[{"role":"user","content":"Hi!"}],
)
# A typed kind="llm_turn" record is upserted as a
# side-effect. Recall it later via records_query
# with kind=["llm_turn"] or tagsAny=["anthropic"].

pip install neruva-record
neruva-record-install
# - merges 10 hooks into ~/.claude/settings.json
#   (UserPromptSubmit, PostToolUse, Stop, SessionStart,
#   SubagentStart/Stop, TaskCreated/Completed, ...)
# - registers @neruva/mcp via 'claude mcp add'
# - sets NERUVA_API_KEY + NERUVA_AUTO_RECORD env
# Restart Claude Code. From now on every Bash, Read,
# Edit, Write, WebFetch, MCP call, prompt, response,
# subagent and task lands as a typed record (kind =
# tool_call / user_prompt / assistant_turn / ...).
# Async fire-and-forget; never slows the agent.

neruva-record-install --uninstall   # to remove cleanly
Every record carries first-class kind (user_prompt, tool_call, assistant_turn, subagent_start, ...), tags (array, queryable), and ts (epoch ms). Calls into Neruva's own MCP are auto-skipped to prevent recording the recording. TTL is per-namespace via the :N suffix (NERUVA_AUTO_RECORD=main:30).
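Because the API is Pinecone-compatible, one plausible way to recall auto-recorded turns is a metadata filter over those fields. A sketch, assuming Neruva honors Pinecone's filter operators ($in, $gte) on kind, tags, and ts -- treat the field names and operator support as illustrative, not confirmed:

```python
# Hypothetical recall filter for auto-recorded turns, written in
# Pinecone's metadata-filter syntax. Field names (kind, tags, ts)
# come from the record schema above; $in/$gte support is assumed.
def recall_filter(kinds, tags_any=None, since_ms=None):
    f = {"kind": {"$in": list(kinds)}}
    if tags_any:
        f["tags"] = {"$in": list(tags_any)}
    if since_ms is not None:
        f["ts"] = {"$gte": since_ms}   # epoch ms, per the record schema
    return f

flt = recall_filter(["llm_turn"], tags_any=["anthropic"])
# then: idx.query(vector=qvec, top_k=10, namespace="main", filter=flt)
```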
# Use the standard pinecone-client, just swap the host.
pip install pinecone-client
from pinecone import Pinecone
pc = Pinecone(
api_key="<NERUVA_API_KEY>",
host="https://api.neruva.io",
)
idx = pc.Index("agents")
idx.upsert(vectors=[...], namespace="coder")
idx.query(vector=qvec, top_k=5, namespace="coder")

curl -X POST https://api.neruva.io/v1/indexes/agents/namespaces/coder/upsert \
-H "Api-Key: $NERUVA_API_KEY" \
-H "Content-Type: application/json" \
-d '{"vectors":[{"id":"n1","values":[0.1,0.2,...]}]}'

curl -X POST https://api.neruva.io/v1/hd/analogy \
-H "Api-Key: $NERUVA_API_KEY" \
-H "Content-Type: application/json" \
-d '{"n_feat":6,"a":1,"b":2,"c":3}'
# returns {candidate, cosine, runner_up, confidence}

Dedicated TypeScript and Python SDKs ship with the next release. For now, any HTTP client works.
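Until the SDKs land, the analogy call is a few lines with any HTTP client. A sketch that mirrors the curl example above using only the standard library (endpoint and payload are taken from the curl; error handling omitted):

```python
import json
from urllib.request import Request

# a : b :: c : ?  -- same payload as the curl example above
payload = {"n_feat": 6, "a": 1, "b": 2, "c": 3}
req = Request(
    "https://api.neruva.io/v1/hd/analogy",
    data=json.dumps(payload).encode(),
    headers={
        "Api-Key": "nv_...",            # your NERUVA_API_KEY
        "Content-Type": "application/json",
    },
    method="POST",
)
# resp = urllib.request.urlopen(req)
# json.load(resp) -> {candidate, cosine, runner_up, confidence}
```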
- One server, many clients. @neruva/mcp is a thin MCP stdio server. It exposes the same tool set to every host, authenticates with your Neruva API key, and forwards calls to api.neruva.io.
- No keys leave your machine. The MCP server reads NERUVA_API_KEY from the local environment. Vectors and queries travel directly to the Neruva API over TLS.
- Same wallet, every surface. Operations from MCP and raw REST all meter against the single wallet you see in the dashboard. One ledger, sub-cent precision, exportable.
- Pinecone-compatible. Already wired to pinecone-client? Point the host argument at https://api.neruva.io and your existing upsert/query/delete code works unchanged.
One key. Every coding agent.
Sign up. Drop the one-line config into your MCP client. Your agent has persistent memory and HD-native reasoning by lunch.