# Vectara MCP Server
The Vectara MCP server enables AI models to interact with Vectara's RAG platform, providing fast, reliable Retrieval-Augmented Generation (RAG), semantic search, and hallucination correction.
## Overview
The Vectara MCP Server provides agentic AI applications with access to fast and reliable Retrieval-Augmented Generation (RAG) capabilities, powered by Vectara's Trusted RAG platform through the Model Context Protocol (MCP). It's compatible with any MCP client, including Claude Desktop.
**Official Server:** Developed and maintained by Vectara.
## Key Features

- **Fast & Reliable RAG:** Access to fast and reliable Retrieval-Augmented Generation with reduced hallucinations.
- **Semantic Search:** Perform advanced semantic searches without generating responses.
- **Hallucination Correction:** Identify and correct hallucinations in generated text using Vectara's VHC API.
- **Secure & Scalable:** Built-in authentication, HTTPS readiness, rate limiting, and CORS protection.
## Available Tools

### Quick Reference

| Tool | Purpose | Category |
|---|---|---|
| `setup_vectara_api_key` | Configure and validate Vectara API key | API Key Management |
| `clear_vectara_api_key` | Clear stored API key | API Key Management |
| `ask_vectara` | Run RAG query with generated response | Query |
| `search_vectara` | Run semantic search query | Query |
| `correct_hallucinations` | Identify and correct hallucinations | Analysis |
### Detailed Usage

#### setup_vectara_api_key

Configure and validate your Vectara API key for the session (one-time setup).

```javascript
use_mcp_tool({
  server_name: "vectara",
  tool_name: "setup_vectara_api_key",
  arguments: {
    api_key: "your_vectara_api_key"
  }
});
```

Returns a success confirmation with a masked API key, or a validation error.
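The exact masking format is not specified by the tool's documentation; the following is a minimal sketch of the common convention of showing only the last few characters (the `mask_api_key` helper is illustrative, not part of Vectara MCP):

```python
def mask_api_key(key: str, visible: int = 4) -> str:
    """Replace all but the last `visible` characters of a key with asterisks."""
    if len(key) <= visible:
        return "*" * len(key)
    return "*" * (len(key) - visible) + key[-visible:]

print(mask_api_key("zqt_abc123def456"))  # ************f456
```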
#### clear_vectara_api_key

Clear the stored API key from server memory.

```javascript
use_mcp_tool({
  server_name: "vectara",
  tool_name: "clear_vectara_api_key",
  arguments: {}
});
```

Returns a confirmation message.
#### ask_vectara

Run a RAG query using Vectara, returning search results with a generated response.

```javascript
use_mcp_tool({
  server_name: "vectara",
  tool_name: "ask_vectara",
  arguments: {
    query: "What is Vectara?",
    corpus_keys: ["my-corpus"],
    n_sentences_before: 2,
    n_sentences_after: 2,
    lexical_interpolation: 0.005,
    max_used_search_results: 10,
    generation_preset_name: "vectara-summary-table-md-query-ext-jan-2025-gpt-4o",
    response_language: "eng"
  }
});
```

Returns the response from Vectara, including the generated answer and the search results.
#### search_vectara

Run a semantic search query using Vectara, without generation.

```javascript
use_mcp_tool({
  server_name: "vectara",
  tool_name: "search_vectara",
  arguments: {
    query: "What is Vectara?",
    corpus_keys: ["my-corpus"],
    n_sentences_before: 2,
    n_sentences_after: 2,
    lexical_interpolation: 0.005
  }
});
```

Returns the response from Vectara, including the matching search results.
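In both query tools, `lexical_interpolation` controls the blend between lexical (keyword) and semantic matching: a value near 0 ranks almost entirely by semantic similarity, while a value near 1 ranks by keyword overlap. A toy sketch of the interpolation idea (illustrative only, not Vectara's actual scoring code):

```python
def blended_score(semantic: float, lexical: float, lam: float = 0.005) -> float:
    """Interpolate relevance scores: lam weights the lexical score, (1 - lam) the semantic."""
    return lam * lexical + (1 - lam) * semantic

# With lam = 0.005 (the value used in the examples above), semantic similarity dominates.
print(blended_score(semantic=0.9, lexical=0.2))
```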
#### correct_hallucinations

Identify and correct hallucinations in generated text using Vectara's VHC (Vectara Hallucination Correction) API.

```javascript
use_mcp_tool({
  server_name: "vectara",
  tool_name: "correct_hallucinations",
  arguments: {
    generated_text: "The generated text to analyze for hallucinations.",
    documents: ["document1", "document2"]
  }
});
```

Returns the analysis of the generated text with hallucination corrections.
## Installation

Add the server to your MCP client configuration:

```json
{
  "mcpServers": {
    "vectara": {
      "command": "python",
      "args": [
        "-m",
        "vectara_mcp"
      ],
      "env": {
        "VECTARA_API_KEY": "your_vectara_api_key"
      }
    }
  }
}
```

**Custom Configuration:** Replace `your_vectara_api_key` with your actual Vectara API key. Additional environment variables such as `VECTARA_AUTHORIZED_TOKENS`, `VECTARA_ALLOWED_ORIGINS`, `VECTARA_TRANSPORT`, and `VECTARA_AUTH_REQUIRED` can also be configured.
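For instance, an HTTP deployment that requires bearer-token authentication might extend the `env` block like this (the values shown are placeholders, and the accepted formats are assumptions to verify against the server's documentation):

```json
"env": {
  "VECTARA_API_KEY": "your_vectara_api_key",
  "VECTARA_TRANSPORT": "http",
  "VECTARA_AUTH_REQUIRED": "true",
  "VECTARA_AUTHORIZED_TOKENS": "token1,token2",
  "VECTARA_ALLOWED_ORIGINS": "https://app.example.com"
}
```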
## Common Use Cases

### 1. Enhancing AI applications with reliable data retrieval

```javascript
use_mcp_tool({
  server_name: "vectara",
  tool_name: "ask_vectara",
  arguments: {
    query: "What are the benefits of using Vectara for RAG?",
    corpus_keys: ["product-documentation"]
  }
});
```
### 2. Running complex queries to extract information from various data sources

```javascript
use_mcp_tool({
  server_name: "vectara",
  tool_name: "search_vectara",
  arguments: {
    query: "Show me all documents related to 'AI ethics' from the last quarter.",
    corpus_keys: ["research-papers", "news-articles"]
  }
});
```
### 3. Integrating with other AI systems for improved performance

```javascript
// Example: a chatbot uses Vectara to answer user questions
const userQuery = "How does Vectara handle data privacy?";
const vectaraResponse = use_mcp_tool({
  server_name: "vectara",
  tool_name: "ask_vectara",
  arguments: {
    query: userQuery,
    corpus_keys: ["privacy-policy", "legal-documents"]
  }
});
// Further processing of vectaraResponse by the chatbot
```
### 4. Hallucination correction

```javascript
const generatedText = "Vectara is a company that sells shoes.";
const sourceDocuments = [
  "Vectara is a company that provides a RAG platform.",
  "Vectara focuses on AI and search."
];
use_mcp_tool({
  server_name: "vectara",
  tool_name: "correct_hallucinations",
  arguments: {
    generated_text: generatedText,
    documents: sourceDocuments
  }
});
```
## Connection String Format

The Vectara MCP server does not use a traditional connection string. Instead, it relies on environment variables for configuration and API keys for authentication. The server runs as an HTTP or SSE service.

- **API Key:** Set the `VECTARA_API_KEY` environment variable.
- **Authentication:** Bearer tokens are used for HTTP/SSE transport.
- **Host/Port:** Configured via command-line arguments (`--host`, `--port`); defaults to `http://127.0.0.1:8000`.
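Assuming the `--host` and `--port` flags behave as described above, a local launch might look like this (verify the exact flag names with `python -m vectara_mcp --help`):

```shell
# Start the Vectara MCP server on the default local address
python -m vectara_mcp --host 127.0.0.1 --port 8000
```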