How to Connect LLM Chat and AI Agents to Enterprise Data Using Built-In MCP in DreamFactory


Secure AI Data Access for ChatGPT, Claude, Perplexity, and Any MCP-Compatible LLM

TL;DR: DreamFactory 7.4+ includes a built-in MCP (Model Context Protocol) server that lets you connect any LLM—ChatGPT, Claude, Perplexity, or custom AI agents—to your enterprise databases through governed, role-based APIs. Setup takes minutes: create an MCP service in the admin console, copy the OAuth credentials, and point your AI application to the generated endpoint. Every query is governed by RBAC and row-level security and captured in full audit logs, so each user only sees the data they're authorized to access.


MCP (Model Context Protocol) is becoming the standard way for AI apps to connect to tools and data—but enterprises still need governed access, least privilege, and auditability.

DreamFactory is a secure, self-hosted enterprise data access platform that provides governed API access to any data source, connecting both enterprise applications and on-prem or hybrid LLMs to your data with role-based access control and identity passthrough.

With instant API generation (connect a database and generate a full REST API in under 60 seconds), built-in auth, RBAC and row-level security, audit logging, and API proxying—plus MCP now built directly into the DreamFactory platform—you get a production-ready path to connect LLMs to enterprise data, returning only what each user is allowed to see.
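To make that API layer concrete, here is a minimal sketch of querying a table through a DreamFactory-generated REST endpoint using Python's requests library. The base URL, service name (mysql), table, and credential values are placeholders for your own instance; the same role-based and row-level rules apply whether the call comes from an application or an MCP-connected LLM.

```python
import requests

# Placeholder values: replace with your instance URL, API key, and session token.
BASE_URL = "https://df.example.com/api/v2"
HEADERS = {
    "X-DreamFactory-API-Key": "<app-api-key>",
    "X-DreamFactory-Session-Token": "<user-session-token>",  # identity context for RBAC
}

# Fetch rows from a table exposed by a generated database service ("mysql" is illustrative).
# RBAC and any row-level filters configured for the user's role are enforced server-side.
resp = requests.get(
    f"{BASE_URL}/mysql/_table/customers",
    headers=HEADERS,
    params={"limit": 10, "fields": "id,name,email"},
)
resp.raise_for_status()
for row in resp.json().get("resource", []):
    print(row)
```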

New Feature: MCP Server Integration for AI Applications

The most significant addition in DreamFactory 7.4+ is the new df-mcp-server integration, which implements the Model Context Protocol specification. MCP has emerged as a standard interface for connecting AI applications—including large language models, AI agents, and copilot systems—to external data sources and tools.

What You Get for Enterprise Teams

Organizations running DreamFactory can now expose their existing REST APIs to AI applications without writing custom integration code. This capability enables several high-value use cases:

Conversational data access: Allow AI assistants to query data from databases, retrieve records, and perform CRUD operations through natural language interfaces—enabling true "chat with your database" experiences.

AI-powered automation: Enable AI agents to interact with enterprise systems through DreamFactory's unified API layer, supporting agentic workflows across your data stack.

Custom AI tooling: Build internal AI applications that leverage existing database connections and business logic without writing new integration code.

Secure AI integration: Maintain DreamFactory's role-based access controls when AI systems interact with sensitive data, ensuring enterprise AI governance at every step.

The MCP Server integration includes support for custom login pages, allowing organizations to maintain consistent authentication experiences across human and AI-driven access patterns.

Why This Matters: AI Innovation with Enterprise-Grade Data Security

Customers want to roll out "chat with your data" and agentic automation quickly—but connecting LLMs to real systems raises security and governance stakes. DreamFactory + MCP helps teams move fast without over-exposing sensitive data.

Common risks when LLMs touch enterprise data

  • Over-broad access that exposes sensitive records
  • Missing identity context (no per-user enforcement)
  • Prompt/SQL injection leading to unintended data retrieval
  • Lack of audit trails for compliance and incident response
  • Inconsistent access patterns across multiple data sources and APIs

What DreamFactory + MCP enables

  • Standard MCP integration backed by a governed enterprise data/API layer
  • Identity passthrough with least-privilege RBAC and row-level security
  • Instant, consistent APIs across 20+ supported databases and systems (plus API proxying)
  • Auditing and logging by default to prove who accessed what
  • "Only return what the user is allowed to see" enforcement for production AI deployments

Prerequisites

Before you start, make sure you have:

  • DreamFactory installed and running (v7.4.1 or later) with access to the Admin Console
  • An LLM client that supports MCP — such as ChatGPT (Developer Mode), Claude Desktop, Perplexity, or any MCP-compatible AI agent framework (e.g., LangChain, LlamaIndex)
  • A database to connect (any supported database is fine—use a dev/test dataset if you're getting started)

Phase 1: Creating Your MCP Service in DreamFactory

DreamFactory makes the backend setup simple by automatically generating the necessary OAuth credentials and endpoints.

Step 1: Initialize the Service

  1. Log in to your DreamFactory instance as an administrator.
  2. Navigate to the AI tab on the top menu.
  3. Click the + (Plus) button to create a new connection.
  4. Select MCP Server from the available service types.

Step 2: Basic Configuration

  • API Name: Enter a lowercase name using letters, numbers, and underscores (e.g., my_mcp_service). This becomes part of your API URL.
  • Label & Description: Provide a friendly name and description for reference in the admin console.

Step 3: Advanced Options

Scroll down to the Advanced options section to link your data:

  • Database Service: Select the existing DreamFactory database service you want to expose.
  • OAuth Credentials: DreamFactory will automatically generate an OAuth Client ID and OAuth Client Secret. Save these—you'll use them to connect your LLM client.

Step 4: Get Your MCP Endpoint

  1. Go to the API Docs tab.
  2. Find your new MCP service and run a GET request to view the service details.
  3. Locate the mcp_endpoint field. This URL is the handshake address your LLM client will use to connect to your MCP service.
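If you prefer to script this step, the sketch below mirrors the API Docs request with Python. It assumes the MCP service answers a GET on its service root (here /api/v2/my_mcp_service, using the name from Step 2) and returns an mcp_endpoint field as described above; confirm the exact path and response shape in your API Docs tab before relying on it.

```python
import requests

BASE_URL = "https://df.example.com/api/v2"  # placeholder for your instance
HEADERS = {
    "X-DreamFactory-API-Key": "<admin-or-app-api-key>",
    "X-DreamFactory-Session-Token": "<session-token>",
}

# Illustrative call: GET the MCP service's root resource and read the
# mcp_endpoint field from the response. Verify the path in the API Docs tab.
resp = requests.get(f"{BASE_URL}/my_mcp_service", headers=HEADERS)
resp.raise_for_status()
mcp_endpoint = resp.json().get("mcp_endpoint")
print("MCP endpoint:", mcp_endpoint)
```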

Phase 2: Connecting an LLM Client to Your MCP Service

Now that your MCP service is live, connect it to ChatGPT so you can start chatting with your data. Step-by-step guides for other LLM clients are coming soon; in the meantime, contact DreamFactory for help connecting any MCP-compatible client.
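For MCP-compatible clients outside ChatGPT (agent frameworks, internal tooling), here is a minimal sketch using the MCP Python SDK. It assumes the endpoint speaks the SSE transport and accepts a bearer token you have already obtained using the OAuth credentials from Phase 1; your client or framework may handle the OAuth exchange and transport selection differently, so treat this as a starting point rather than a prescribed integration.

```python
import asyncio

from mcp import ClientSession
from mcp.client.sse import sse_client

MCP_ENDPOINT = "https://df.example.com/<your-mcp-endpoint>"  # from Phase 1, Step 4
ACCESS_TOKEN = "<oauth-access-token>"  # obtained with the Client ID/Secret from Phase 1

async def main() -> None:
    # Open the SSE transport and start an MCP session against the DreamFactory endpoint.
    headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}
    async with sse_client(MCP_ENDPOINT, headers=headers) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Discover the tools the MCP service exposes (e.g., schema and table queries).
            tools = await session.list_tools()
            for tool in tools.tools:
                print(tool.name, "-", tool.description)

asyncio.run(main())
```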

Step 1: Enable Developer Mode

To use custom MCP servers, ensure Developer Mode is enabled in your ChatGPT settings.

Step 2: Create the AI App

  1. In ChatGPT, go to Settings → Apps → Advanced Settings.
  2. Click Create app.
  3. Fill in the application details:
    • Endpoint: Paste the mcp_endpoint URL you copied from DreamFactory.
    • Authentication: Set to OAuth.
    • Client ID & Secret: Paste the credentials DreamFactory generated in Phase 1.

Step 3: Connect and Authenticate

  1. Open a new chat in ChatGPT.
  2. Use the Attach (or Plugins/Apps) menu to select your newly created MCP application.
  3. When prompted, log in via your DreamFactory instance to authorize the connection.

Step 4: Run a Test Query

Once connected, try a quick natural-language prompt such as:

  • "Show me the tables in my database."
  • "List the first 10 records from the Users table."

ChatGPT will use MCP to fetch schema and data and present results directly in the chat.


Conclusion

By bridging your database and any LLM client via DreamFactory's MCP service, you turn a static data source into an interactive, AI-accessible asset. Whether you're enabling "chat with your data," building agentic workflows, or connecting an LLM database connector to your enterprise systems, this setup provides a secure, scalable foundation—so AI systems only retrieve what users are authorized to see.

Ready to start? See the full MCP service creation documentation for detailed configuration options, or contact DreamFactory for assistance.


FAQ

Q1: What is MCP (Model Context Protocol) and why does it matter for enterprise AI?

MCP is a standard way for AI applications to connect to tools and data sources. For enterprises, MCP matters because it reduces custom integration work and makes it easier to standardize how LLM chat experiences and AI agents access internal systems—when paired with governance controls like RBAC, identity passthrough, and audit logging.

Q2: Does DreamFactory's built-in MCP work for both "chat with your data" and AI agents?

Yes. Both use cases rely on the same core capability: secure tool/data access. DreamFactory's MCP service provides a standardized connection for AI clients, while DreamFactory enforces authentication, RBAC, row-level security, and auditing so responses include only authorized data.

Q3: How does DreamFactory ensure an LLM only returns data a user is allowed to see?

DreamFactory uses identity passthrough (where applicable) plus authentication and role-based access control, and can enforce row-level security policies. This ensures queries are evaluated under the correct user context and restricted to the permitted tables, fields, and records.

Q4: What do I need to connect ChatGPT to DreamFactory using MCP?

You need DreamFactory v7.4.1+ with an MCP Server service configured, an existing database service to expose, and access to ChatGPT Developer Mode. DreamFactory generates OAuth credentials and provides an mcp_endpoint URL that you add in ChatGPT when creating the app connection.

Q5: What LLMs and AI clients are compatible with DreamFactory's MCP server?

DreamFactory's MCP server works with any client that supports the Model Context Protocol specification. This includes ChatGPT, Claude Desktop, Perplexity, LangChain, LlamaIndex, and custom AI agent frameworks. Because MCP is an open standard, any compliant client can connect using the OAuth credentials and endpoint URL that DreamFactory generates.

Q6: How long does it take to set up MCP in DreamFactory?

Most teams can configure their first MCP service in under 10 minutes. The process involves creating the MCP service in the admin console, selecting a database, and copying the auto-generated OAuth credentials into your LLM client. No custom integration code is required.

Q7: Can AI agents write data back to the database through DreamFactory?

Yes. DreamFactory's MCP integration supports full CRUD operations (Create, Read, Update, Delete), so AI agents can not only query data but also insert, update, and delete records—all governed by the same RBAC policies and audit logging that apply to read operations.
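As an illustration of what a write operation looks like under the hood, the sketch below inserts a record through the same governed REST layer the MCP service fronts. The service name, table, and credential values are placeholders, and the call succeeds only if the caller's role grants write access to that table.

```python
import requests

BASE_URL = "https://df.example.com/api/v2"  # placeholder for your instance
HEADERS = {
    "X-DreamFactory-API-Key": "<app-api-key>",
    "X-DreamFactory-Session-Token": "<user-session-token>",
}

# Insert a record; DreamFactory expects new rows wrapped in a "resource" array.
# The write is rejected if the caller's role lacks POST access to this table.
payload = {"resource": [{"name": "Acme Corp", "status": "active"}]}
resp = requests.post(
    f"{BASE_URL}/mysql/_table/customers",
    headers=HEADERS,
    json=payload,
)
resp.raise_for_status()
print(resp.json())
```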

Q8: How does DreamFactory's built-in MCP compare to building a custom MCP server?

Building a custom MCP server requires writing and maintaining integration code, implementing authentication, managing access controls, and building audit logging from scratch. DreamFactory provides all of this out of the box: instant API generation across 20+ database types, built-in OAuth, RBAC with row-level security, and full audit trails. Teams typically save weeks of development time by using DreamFactory's built-in MCP instead of building a custom solution.

Q9: Does DreamFactory MCP support streaming responses?

DreamFactory's MCP server follows the MCP specification for transport, which supports Server-Sent Events (SSE) for streaming. This means LLM clients that support streaming can receive data incrementally rather than waiting for a complete response.
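To make the transport concrete, here is a minimal sketch of reading a Server-Sent Events stream with httpx. It is illustrative only: in practice your MCP client library handles the SSE framing and the MCP message protocol for you, and the endpoint URL and token below are placeholders.

```python
import httpx

MCP_ENDPOINT = "https://df.example.com/<your-mcp-endpoint>"  # placeholder
HEADERS = {"Authorization": "Bearer <oauth-access-token>"}   # placeholder

# Open the SSE stream and print event payloads as they arrive, line by line.
with httpx.stream("GET", MCP_ENDPOINT, headers=HEADERS, timeout=None) as response:
    response.raise_for_status()
    for line in response.iter_lines():
        if line.startswith("data:"):
            print(line[len("data:"):].strip())
```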