TL;DR: DreamFactory 7.4+ includes a built-in MCP (Model Context Protocol) server that lets you connect any LLM—ChatGPT, Claude, Perplexity, or custom AI agents—to your enterprise databases through governed, role-based APIs. Setup takes minutes: create an MCP service in the admin console, copy the OAuth credentials, and point your AI application to the generated endpoint. All queries are governed by RBAC, row-level security, and full audit logging—so each user only sees the data they're authorized to access.
MCP (Model Context Protocol) is becoming the standard way for AI apps to connect to tools and data—but enterprises still need governed access, least privilege, and auditability.
DreamFactory is a secure, self-hosted enterprise data access platform that provides governed API access to any data source, connecting enterprise applications, as well as on-prem and hybrid LLMs, with role-based access and identity passthrough.
With instant API generation (connect a database and generate a full REST API in under 60 seconds), built-in auth, RBAC and row-level security, audit logging, and API proxying—plus MCP now built directly into the DreamFactory platform—you get a production-ready path to connect LLMs to enterprise data, returning only what each user is allowed to see.
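As a rough illustration of the generated REST surface, the sketch below assembles a query against a hypothetical `customers` table exposed by a DreamFactory database service named `mysql` (the host, service, and table names are assumptions; the `/api/v2/{service}/_table/{table}` path and `X-DreamFactory-*` headers follow DreamFactory's documented conventions):

```python
# Sketch: building a request against a DreamFactory-generated REST API.
# Host, service name ("mysql"), and table ("customers") are placeholders.

def build_table_query(host, service, table, api_key, session_token,
                      filter_expr=None):
    """Return (url, headers, params) for a DreamFactory table query."""
    url = f"https://{host}/api/v2/{service}/_table/{table}"
    headers = {
        "X-DreamFactory-Api-Key": api_key,              # per-app API key
        "X-DreamFactory-Session-Token": session_token,  # user's session JWT
    }
    params = {"filter": filter_expr} if filter_expr else {}
    return url, headers, params

url, headers, params = build_table_query(
    "df.example.com", "mysql", "customers",
    api_key="APP_KEY", session_token="USER_JWT",
    filter_expr="region = 'EMEA'",
)
print(url)  # https://df.example.com/api/v2/mysql/_table/customers
```

Because the session token identifies the calling user, the same request made under two different roles can return different rows.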
The most significant addition in DreamFactory 7.4+ is the new df-mcp-server integration, which implements the Model Context Protocol specification. MCP has emerged as a standard interface for connecting AI applications—including large language models, AI agents, and copilot systems—to external data sources and tools.
Organizations running DreamFactory can now expose their existing REST APIs to AI applications without writing custom integration code. This capability enables several high-value use cases:
Conversational data access: Allow AI assistants to query data from databases, retrieve records, and perform CRUD operations through natural language interfaces—enabling true "chat with your database" experiences.
AI-powered automation: Enable AI agents to interact with enterprise systems through DreamFactory's unified API layer, supporting agentic workflows across your data stack.
Custom AI tooling: Build internal AI applications that leverage existing database connections and business logic without writing new integration code.
Secure AI integration: Maintain DreamFactory's role-based access controls when AI systems interact with sensitive data, ensuring enterprise AI governance at every step.
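To make the row-level-security idea concrete, here is a purely conceptual sketch (not DreamFactory's internal implementation): a per-role predicate is applied before any rows reach the AI client, and unknown roles default to deny.

```python
# Conceptual sketch of row-level security: each role maps to a predicate
# deciding which rows it may see. The roles and policy are assumptions.

ROLE_FILTERS = {
    "sales_emea": lambda row: row["region"] == "EMEA",
    "admin": lambda row: True,
}

def visible_rows(rows, role):
    """Return only the rows the given role is authorized to see."""
    allow = ROLE_FILTERS.get(role, lambda row: False)  # default deny
    return [row for row in rows if allow(row)]

rows = [
    {"id": 1, "region": "EMEA"},
    {"id": 2, "region": "APAC"},
]
print(visible_rows(rows, "sales_emea"))  # [{'id': 1, 'region': 'EMEA'}]
```

The key point is that filtering happens server-side, under the caller's identity, so the LLM never receives rows the user could not have queried directly.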
The MCP Server integration includes support for custom login pages, allowing organizations to maintain consistent authentication experiences across human and AI-driven access patterns.
Customers want to roll out "chat with your data" and agentic automation quickly—but connecting LLMs to real systems raises security and governance stakes. DreamFactory + MCP helps teams move fast without over-exposing sensitive data.
Common risks when LLMs touch enterprise data
What DreamFactory + MCP enables
Before you start, make sure you have:

DreamFactory v7.4.1 or later installed.
An existing database service configured in DreamFactory that you want to expose.
Access to ChatGPT Developer Mode (for the ChatGPT walkthrough below).
DreamFactory makes the backend setup simple by automatically generating the necessary OAuth credentials and endpoints.
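For context on what those auto-generated credentials are used for, the sketch below builds the standard OAuth 2.0 client-credentials token request an MCP client would send. The token URL is a placeholder—use the endpoint shown in your DreamFactory admin console—and the exact grant type your deployment uses may differ:

```python
# Sketch: a standard OAuth 2.0 client-credentials exchange using the
# credentials DreamFactory generates. The token URL is a placeholder.

def token_request(token_url, client_id, client_secret):
    """Return (url, form_body) for an OAuth 2.0 client-credentials grant."""
    body = {
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
    }
    return token_url, body

url, body = token_request(
    "https://df.example.com/oauth/token",  # assumed path
    client_id="GENERATED_ID",
    client_secret="GENERATED_SECRET",
)
```

Most MCP-capable clients handle this exchange for you once you paste in the client ID and secret.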
Give your new MCP service a name (e.g., my_mcp_service). This becomes part of your API URL.

Scroll down to the Advanced options section to link your data:
Copy the URL from the mcp_endpoint field. This URL is the handshake address your LLM client will use to connect to your MCP service.

Now that your MCP service is live, connect it to ChatGPT so you can start chatting with your data. Instructions for other LLMs will be provided in the near future. In the meantime, contact DreamFactory to get set up with any LLM.
To use custom MCP servers, ensure Developer Mode is enabled in your ChatGPT settings.
When adding the connection, enter the mcp_endpoint URL you copied from DreamFactory.

Once connected, try a quick natural-language prompt such as:
"Show me the records in the Users table." ChatGPT will use MCP to fetch schema and data and present results directly in the chat.
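Behind the scenes, a prompt like that becomes an MCP tools/call message—a JSON-RPC 2.0 request, per the Model Context Protocol specification. The tool name `query_table` and its arguments below are illustrative assumptions; the actual tools are whatever the DreamFactory MCP service advertises:

```python
import json

# Sketch: serializing an MCP tools/call request (JSON-RPC 2.0).
# Tool name and arguments are hypothetical.

def tools_call(request_id, tool, arguments):
    """Build the JSON for an MCP tools/call request."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })

msg = tools_call(1, "query_table", {"table": "Users", "limit": 5})
```

The client never talks to the database directly; every call passes through DreamFactory's API layer and its access controls.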
By bridging your database and any LLM client via DreamFactory's MCP service, you turn a static data source into an interactive, AI-accessible asset. Whether you're enabling "chat with your data," building agentic workflows, or connecting an LLM database connector to your enterprise systems, this setup provides a secure, scalable foundation—so AI systems only retrieve what users are authorized to see.
Ready to start? See the full MCP service creation documentation for detailed configuration options. Or contact DreamFactory for assistance.
Q1: What is MCP (Model Context Protocol) and why does it matter for enterprise AI?
MCP is a standard way for AI applications to connect to tools and data sources. For enterprises, MCP matters because it reduces custom integration work and makes it easier to standardize how LLM chat experiences and AI agents access internal systems—when paired with governance controls like RBAC, identity passthrough, and audit logging.
Q2: Does DreamFactory's built-in MCP work for both "chat with your data" and AI agents?
Yes. Both use cases rely on the same core capability: secure tool/data access. DreamFactory's MCP service provides a standardized connection for AI clients, while DreamFactory enforces authentication, RBAC, row-level security, and auditing so responses include only authorized data.
Q3: How does DreamFactory ensure an LLM only returns data a user is allowed to see?
DreamFactory uses identity passthrough (where applicable) plus authentication and role-based access control, and can enforce row-level security policies. This ensures queries are evaluated under the correct user context and restricted to the permitted tables, fields, and records.
Q4: What do I need to connect ChatGPT to DreamFactory using MCP?
You need DreamFactory v7.4.1+ with an MCP Server service configured, an existing database service to expose, and access to ChatGPT Developer Mode. DreamFactory generates OAuth credentials and provides an mcp_endpoint URL that you add in ChatGPT when creating the app connection.
Q5: What LLMs and AI clients are compatible with DreamFactory's MCP server?
DreamFactory's MCP server works with any client that supports the Model Context Protocol specification. This includes ChatGPT, Claude Desktop, Perplexity, LangChain, LlamaIndex, and custom AI agent frameworks. Because MCP is an open standard, any compliant client can connect using the OAuth credentials and endpoint URL that DreamFactory generates.
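Because MCP is an open standard, every compliant client begins a session the same way: with an initialize request (JSON-RPC 2.0, per the MCP specification). The protocol version string and client name below are placeholders:

```python
import json

# Sketch: the MCP "initialize" handshake a compliant client starts with.
# Version string and client name are placeholders, not fixed values.

def initialize_request(protocol_version, client_name):
    """Build the JSON for an MCP initialize request."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": 0,
        "method": "initialize",
        "params": {
            "protocolVersion": protocol_version,
            "capabilities": {},
            "clientInfo": {"name": client_name, "version": "0.1"},
        },
    })

msg = initialize_request("2025-03-26", "my-agent")
```

After the handshake, the client typically lists available tools (tools/list) and then invokes them—all against the same DreamFactory endpoint, regardless of which AI client is connecting.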
Q6: How long does it take to set up MCP in DreamFactory?
Most teams can configure their first MCP service in under 10 minutes. The process involves creating the MCP service in the admin console, selecting a database, and copying the auto-generated OAuth credentials into your LLM client. No custom integration code is required.
Q7: Can AI agents write data back to the database through DreamFactory?
Yes. DreamFactory's MCP integration supports full CRUD operations (Create, Read, Update, Delete), so AI agents can not only query data but also insert, update, and delete records—all governed by the same RBAC policies and audit logging that apply to read operations.
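The CRUD operations map onto DreamFactory's generated REST verbs roughly as sketched below (service and table names are placeholders; the `/_table` path follows DreamFactory's documented API conventions):

```python
# Sketch: how CRUD operations map onto DreamFactory's REST verbs.
# Service and table names are placeholders.

CRUD_VERBS = {
    "create": "POST",
    "read": "GET",
    "update": "PATCH",
    "delete": "DELETE",
}

def crud_request(op, service, table):
    """Return (HTTP verb, path) for a CRUD operation on a table."""
    return CRUD_VERBS[op], f"/api/v2/{service}/_table/{table}"

print(crud_request("update", "mysql", "customers"))
# ('PATCH', '/api/v2/mysql/_table/customers')
```

Whether a given agent may actually issue a write is decided by its role: a read-only role can be granted GET while being denied POST, PATCH, and DELETE.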
Q8: How does DreamFactory's built-in MCP compare to building a custom MCP server?
Building a custom MCP server requires writing and maintaining integration code, implementing authentication, managing access controls, and building audit logging from scratch. DreamFactory provides all of this out of the box: instant API generation across 20+ database types, built-in OAuth, RBAC with row-level security, and full audit trails. Teams typically save weeks of development time by using DreamFactory's built-in MCP instead of building a custom solution.
Q9: Does DreamFactory MCP support streaming responses?
DreamFactory's MCP server follows the MCP specification for transport, which supports Server-Sent Events (SSE) for streaming. This means LLM clients that support streaming can receive data incrementally rather than waiting for a complete response.
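For a sense of what SSE looks like on the wire, here is a minimal sketch of parsing an SSE stream: each event's payload lines start with `data:` and a blank line terminates the event (per the SSE format; the payloads shown are invented):

```python
# Sketch: minimal parsing of a Server-Sent Events (SSE) stream.
# Payload lines start with "data:"; a blank line ends an event.

def parse_sse(stream):
    """Extract the data payloads from a raw SSE stream."""
    events, current = [], []
    for line in stream.splitlines():
        if line.startswith("data:"):
            current.append(line[len("data:"):].strip())
        elif line == "" and current:
            events.append("\n".join(current))  # blank line closes the event
            current = []
    if current:
        events.append("\n".join(current))
    return events

raw = 'data: {"chunk": 1}\n\ndata: {"chunk": 2}\n\n'
print(parse_sse(raw))  # ['{"chunk": 1}', '{"chunk": 2}']
```

In practice the MCP client library handles this parsing; the point is that results can arrive as a sequence of events rather than one large response.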