Introduction

Data MCP (Data Model Context Protocol) is a Python backend server built with FastAPI that enables you to connect to any data source via query, API, or code and expose data operations as MCP (Model Context Protocol) tools for AI assistants. This allows AI models to interact with your data through natural language, making data access and analysis more intuitive and accessible.

What is Data MCP?

Data MCP serves as a bridge between AI assistants and your data sources. It provides:

  • Data Source Connectivity: Connect to multiple data source types (PostgreSQL, MySQL, SQLite, Databricks, APIs, and more)
  • Query Management: Store and manage parameterized queries with Jinja template support (see the sketch after this list)
  • MCP Integration: Expose data operations as tools that AI assistants can use
  • Web UI: Simple interface for managing datasources and tools
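
For a concrete sense of what a stored, parameterized query can look like, here is a minimal sketch that renders a Jinja template into SQL using the standard jinja2 package. The table, columns, and parameter names are illustrative assumptions, not part of Data MCP itself.

    from jinja2 import Template

    # Hypothetical stored query: Jinja handles optional structure, while
    # literal values are left as bind parameters for the datasource driver.
    QUERY_TEMPLATE = """
    SELECT id, name, created_at
    FROM customers
    {% if region %}WHERE region = :region{% endif %}
    ORDER BY created_at DESC
    LIMIT {{ limit | int }}
    """

    sql = Template(QUERY_TEMPLATE).render(region="EMEA", limit=50)
    print(sql)  # the value for :region is still bound at execution time

Rendering templates this way keeps the query structure flexible while leaving value substitution to the database driver.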

Architecture Overview

┌─────────────────┐    ┌──────────────┐    ┌─────────────────┐
│   AI Assistant  │    │   Data MCP   │    │   Data Sources  │
│   (Claude, etc) │◄──►│   Server     │◄──►│   (Databases,   │
│                 │    │              │    │    APIs, etc)   │
└─────────────────┘    └──────────────┘    └─────────────────┘


                       ┌──────────────┐
                       │   Web UI     │
                       │   (Optional) │
                       └──────────────┘

Key Features

Multi-Data Source Support

Currently supports the following data sources:

  • PostgreSQL - Full support with async operations (see the example after this list)
  • MySQL - Complete MySQL/MariaDB support
  • SQLite - Lightweight local data storage support
  • Databricks - Cloud data warehouse integration
  • Coming Soon - Snowflake, Redshift, and more
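
The adapter internals are not covered in this overview; as a rough illustration of the kind of asynchronous access the PostgreSQL support implies, the standalone sketch below uses asyncpg directly. The DSN is a placeholder, and this is not how Data MCP's adapter is necessarily implemented.

    import asyncio

    import asyncpg

    async def check_connection(dsn: str) -> None:
        # Open an async connection, run a trivial query, then close cleanly.
        conn = await asyncpg.connect(dsn)
        try:
            row = await conn.fetchrow("SELECT version()")
            print(row["version"])
        finally:
            await conn.close()

    asyncio.run(check_connection("postgresql://user:secret@localhost:5432/mydb"))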

MCP Tool Integration

Transform your data queries and operations into AI-accessible tools:

  • Create named queries with parameters
  • Expose complex data operations as simple tools (a minimal sketch follows this list)
  • Support for Jinja templating in queries
  • Automatic parameter validation and type checking
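
To make the idea concrete, the following self-contained sketch exposes a single parameterized query as an MCP tool using the official MCP Python SDK (FastMCP) and a local SQLite file. The tool name, schema, and query are assumptions for illustration and do not describe Data MCP's internals.

    import sqlite3

    from mcp.server.fastmcp import FastMCP

    mcp = FastMCP("data-tools-demo")

    @mcp.tool()
    def top_customers(limit: int = 10) -> list[dict]:
        """Return the most recently created customers (illustrative query)."""
        # The type hints give MCP clients a typed, validated parameter schema.
        conn = sqlite3.connect("example.db")
        conn.row_factory = sqlite3.Row
        try:
            rows = conn.execute(
                "SELECT id, name FROM customers ORDER BY created_at DESC LIMIT ?",
                (limit,),
            ).fetchall()
            return [dict(row) for row in rows]
        finally:
            conn.close()

    if __name__ == "__main__":
        mcp.run()  # serves the tool over stdio by default

An AI assistant connected to this server sees a top_customers tool with an integer limit parameter and can call it by name.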

Web Interface

  • Intuitive UI for managing datasources
  • Tool creation and management interface
  • Real-time query testing and validation
  • Connection testing and monitoring

Getting Started

The typical workflow involves:

  1. Installation - Set up the Data MCP server
  2. Initial Access - Log in with the default admin credentials (admin/dochangethispassword)
  3. Security Setup - Change the default admin password
  4. Configuration - Configure your data source connections
  5. Tool Creation - Create MCP tools from your queries and operations
  6. MCP Integration - Connect AI assistants to your tools (an illustrative client sketch follows)
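
Assistants such as Claude Desktop are normally pointed at a server through their own configuration. As a rough illustration of step 6, this sketch uses the MCP Python SDK's stdio client to launch a local server script and call one of its tools; the script path and tool name are placeholders, not part of Data MCP.

    import asyncio

    from mcp import ClientSession, StdioServerParameters
    from mcp.client.stdio import stdio_client

    async def main() -> None:
        # Launch a local MCP server process (placeholder command) over stdio.
        params = StdioServerParameters(command="python", args=["server.py"])
        async with stdio_client(params) as (read, write):
            async with ClientSession(read, write) as session:
                await session.initialize()
                tools = await session.list_tools()
                print([tool.name for tool in tools.tools])
                result = await session.call_tool("top_customers", {"limit": 5})
                print(result.content)

    asyncio.run(main())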

Ready to get started? Check out the Installation Guide to begin your Data MCP journey!