
Confluent MCP Server

Official Confluent MCP server for Kafka, Flink, Schema Registry, Connectors, Tableflow, and billing management

Overall Score: 71/100


Server Info

Package: @confluentinc/mcp-confluent
Registry: npm
Maintainer: Community
Category: Analytics & Data
Tags: streaming, messaging, events
Last Scanned: 7 Apr 2026

Findings

7 issues

Authentication & Identity

MEDIUM: HTTP/SSE transport supports per-request credentials

Supports stdio, SSE, and Streamable HTTP transports via Fastify. HTTP/SSE transports support API key auth (MCP_API_KEY, min 32 chars) with DNS rebinding protection (MCP_ALLOWED_HOSTS). Auth can be disabled for dev via MCP_AUTH_DISABLED. No MCP OAuth. Tools are auto-enabled/disabled based on which API key env vars are present. Each handler declares required env vars via getRequiredEnvVars().
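As a sketch, the auth-related environment variables named above might be set like this for an HTTP deployment (the launch command is illustrative; consult the package's own documentation for exact flags):

```shell
# Illustrative HTTP-transport configuration using the env vars named above.
# Generate an API key that clears the 32-character minimum.
export MCP_API_KEY="$(openssl rand -hex 32)"     # 64 hex chars
# Restrict accepted Host headers to defeat DNS rebinding.
export MCP_ALLOWED_HOSTS="localhost,127.0.0.1"
# MCP_AUTH_DISABLED=true is for local development only; never set it in production.

npx -y @confluentinc/mcp-confluent   # launch command shown for illustration
```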

Remediation

Implement the MCP OAuth spec so users authenticate directly without platform mediation.

Tool Schema Quality

MEDIUM: Only 13 of 50 schemas have parameter constraints

Most schemas lack maxLength, enum, or pattern constraints on string parameters.

Remediation

Add constraints to string parameters, especially on write operations.
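As a hedged sketch (tool and parameter names are hypothetical, not the server's actual schemas), the kind of constraint this remediation asks for can be expressed as plain checks, or equivalently as zod `.max()`/`.regex()`/`z.enum()` refinements:

```typescript
// Hypothetical parameter validation for a topic-creation tool.
// Kafka topic names are at most 249 chars drawn from [A-Za-z0-9._-].
const TOPIC_NAME = /^[A-Za-z0-9._-]{1,249}$/;
const CLEANUP_POLICIES = new Set(["delete", "compact"]); // enum, not free text

function validateCreateTopic(p: { topicName: string; cleanupPolicy: string }): string[] {
  const errors: string[] = [];
  if (!TOPIC_NAME.test(p.topicName)) {
    errors.push("topicName: 1-249 chars of letters, digits, '.', '_', '-'");
  }
  if (!CLEANUP_POLICIES.has(p.cleanupPolicy)) {
    errors.push("cleanupPolicy: must be 'delete' or 'compact'");
  }
  return errors;
}
```

Rejecting malformed input before it reaches the Kafka Admin API narrows the injection surface on write operations.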

CRITICAL: Dangerous execution surface: create-flink-statement accepts arbitrary Flink SQL statements (max 131072 chars) which can create/alter/drop tables and run arbitrary queries

Tool allows raw code/query execution which could be exploited via prompt injection.

Remediation

Use parameterized queries or validated command sets.
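One shape a "validated command set" could take (a sketch, not the server's implementation): classify the statement's leading keyword and allow only read-only kinds unless writes are explicitly enabled:

```typescript
// Hypothetical guard for a Flink SQL execution tool: permit only
// read-only statement kinds by default.
const READ_ONLY_KINDS = new Set(["SELECT", "SHOW", "DESCRIBE", "EXPLAIN"]);

function isReadOnlyFlinkStatement(sql: string): boolean {
  // Drop leading whitespace and SQL comments before classifying.
  const stripped = sql
    .replace(/^(\s|--[^\n]*\n|\/\*[\s\S]*?\*\/)+/, "")
    .trimStart();
  // Conservatively reject multi-statement batches: any ';' that is not
  // the trailing terminator fails the check.
  const body = stripped.replace(/;\s*$/, "");
  if (body.includes(";")) return false;
  const keyword = body.split(/\s+/, 1)[0]?.toUpperCase() ?? "";
  return READ_ONLY_KINDS.has(keyword);
}
```

This is deliberately conservative (a ';' inside a string literal would also be rejected); a production version would want a real SQL parser, but even a keyword gate blocks a prompt-injected DROP TABLE.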

LLM Safety

MEDIUM: 3 tool descriptions are too vague

Short or generic descriptions make tool selection unreliable.

Remediation

Expand descriptions with specific actions, data types, and side effects.
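For example (the rewritten text is an illustration, not the server's actual metadata), compare the current read-connector description from the tool list with one that names the action, the data returned, and the side effects:

```typescript
// Before: the current description is generic.
const vague = {
  name: "read-connector",
  description: "Get information about the connector.",
};

// After: a hypothetical rewrite naming action, inputs, outputs, and side effects.
const specific = {
  name: "read-connector",
  description:
    "Fetch the configuration, status, and task list of one named Kafka " +
    "connector in Confluent Cloud. Read-only, no side effects. Requires a " +
    "connector name obtained from list-connectors; returns JSON.",
};
```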

HIGH: Tool descriptions contain instructional language

Descriptions include directives that could influence LLM behavior beyond tool selection.

Remediation

Remove instructional language. Descriptions should be purely factual.

Data Exposure

MEDIUM: 5 list operations lack pagination

Flink statement listing has good pagination (pageSize max 100, pageToken). Environments and billing have pagination. Tableflow lists mention pagination in descriptions. However, list-topics returns all topics with no pagination. consume-messages has configurable maxMessages (default 10) and timeoutMs (default 10000ms) which limits data exposure. No field selection on any tool.

Remediation

Add limit/offset or cursor-based pagination.
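A minimal cursor-style sketch (names hypothetical) of what this remediation asks for on a tool like list-topics:

```typescript
// Hypothetical cursor-based pagination for a list endpoint.
interface Page<T> {
  items: T[];
  nextPageToken?: string; // absent on the last page
}

function paginate<T>(all: T[], pageSize = 100, pageToken?: string): Page<T> {
  const start = pageToken ? parseInt(pageToken, 10) : 0;
  const items = all.slice(start, start + pageSize);
  const next = start + pageSize;
  return next < all.length ? { items, nextPageToken: String(next) } : { items };
}
```

The caller passes each page's nextPageToken back in until it is absent, mirroring the pageSize/pageToken scheme the Flink statement listing already uses.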

LOW: No field selection on responses

Responses return full records rather than projected fields.

Remediation

Implement field selection to return only relevant fields.
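A sketch of the remediation (helper name hypothetical): accept an optional fields parameter and project each record before returning it, so large config maps stay out of the LLM context unless requested:

```typescript
// Hypothetical field-selection helper: keep only the requested keys.
function project(
  record: Record<string, unknown>,
  fields?: string[],
): Record<string, unknown> {
  if (!fields || fields.length === 0) return record; // no selection: full record
  const out: Record<string, unknown> = {};
  for (const f of fields) {
    if (f in record) out[f] = record[f];
  }
  return out;
}
```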

Tools

50 total
Name | Description | Risk
list-topics | List all topics in the Kafka cluster. | read
create-topics | Create one or more Kafka topics. | write
delete-topics | Delete the topics with the given names. | admin
produce-message | Produce records to a Kafka topic. Supports Confluent Schema Registry serialization (AVRO, JSON, PROTOBUF) for both key and value. | write
consume-messages | Consume messages from one or more Kafka topics. Supports automatic deserialization of Schema Registry encoded messages (AVRO, JSON, PROTOBUF). | read
alter-topic-config | Alter topic configuration in Confluent Cloud. | write
get-topic-config | Retrieve configuration details for a specific Kafka topic. | read
list-flink-statements | Retrieve a sorted, filtered, paginated list of all statements. | read
create-flink-statement | Make a request to create a statement. | write
read-flink-statement | Make a request to read a statement and its results. | read
delete-flink-statements | Make a request to delete a statement. | admin
get-flink-statement-exceptions | Retrieve the 10 most recent exceptions for a Flink SQL statement. Useful for diagnosing failed or failing statements. | read
list-flink-catalogs | List all catalogs available in the Flink environment via INFORMATION_SCHEMA.CATALOGS. | read
list-flink-databases | List all databases (schemas) in a Flink catalog via INFORMATION_SCHEMA.SCHEMATA. Returns catalog and database names. | read
list-flink-tables | List all tables in a Flink database via INFORMATION_SCHEMA.TABLES. Returns table names and types. | read
describe-flink-table | Get full schema details for a Flink table via INFORMATION_SCHEMA.COLUMNS. Returns column names, data types (including $rowtime), nullability, and metadata column info. | read
get-flink-table-info | Get table metadata via INFORMATION_SCHEMA.TABLES. Returns watermark configuration, distribution info, and table type. | read
check-flink-statement-health | Perform an aggregate health check for a Flink SQL statement. Returns status (healthy/warning/critical), current phase, recent exceptions, and diagnostic details. | read
detect-flink-statement-issues | Detect issues for a Flink SQL statement by analyzing status, exceptions, and performance metrics. Identifies problems such as failures, backpressure, consumer lag, late data, and memory issues, and provides suggested fixes. | read
get-flink-statement-profile | Get Query Profiler data for a Flink SQL statement. Returns the task graph with human-readable task/operator names, per-task metrics (records in/out, state size, busyness, idleness, backpressure, watermarks), and automated issue detection (backpressure bottlenecks, consumer lag, late data, large state). | read
list-connectors | Retrieve a list of names of the active connectors. You can then make a read request for a specific connector by name. | read
read-connector | Get information about the connector. | read
create-connector | Create a new connector. Returns the new connector information if successful. | write
delete-connector | Delete an existing connector. Returns a success message if deletion was successful. | admin
search-topics-by-tag | List all topics in the Kafka cluster with the specified tag. | read
search-topics-by-name | List all topics in the Kafka cluster matching the specified name. | read
create-topic-tags | Create new tag definitions in Confluent Cloud. | write
delete-tag | Delete a tag definition from Confluent Cloud. | admin
remove-tag-from-entity | Remove a tag from an entity in Confluent Cloud. | write
add-tags-to-topic | Assign existing tags to Kafka topics in Confluent Cloud. | write
list-tags | Retrieve all tags with definitions from Confluent Cloud Schema Registry. | read
list-clusters | Get all clusters in the Confluent Cloud environment. | read
list-environments | Get all environments in Confluent Cloud with pagination support. | read
read-environment | Get details of a specific environment by ID. | read
list-schemas | List all schemas in the Schema Registry. | read
delete-schema | Delete a schema subject or a specific version from the Schema Registry. If version is omitted, all versions of the subject are deleted. | admin
create-tableflow-topic | Make a request to create a tableflow topic. | write
list-tableflow-regions | Retrieve a sorted, filtered, paginated list of all tableflow regions. | read
list-tableflow-topics | Retrieve a sorted, filtered, paginated list of all tableflow topics. | read
read-tableflow-topic | Make a request to read a tableflow topic. | read
update-tableflow-topic | Make a request to update a tableflow topic. | write
delete-tableflow-topic | Make a request to delete a tableflow topic. | admin
create-tableflow-catalog-integration | Make a request to create a catalog integration. | write
list-tableflow-catalog-integrations | Retrieve a sorted, filtered, paginated list of all catalog integrations. | read
read-tableflow-catalog-integration | Make a request to read a catalog integration. | read
update-tableflow-catalog-integration | Make a request to update a catalog integration. | write
delete-tableflow-catalog-integration | Make a request to delete a tableflow catalog integration. | admin
list-billing-costs | Retrieve billing cost data for a Confluent Cloud organization within a specified date range, with pagination support. | read
query-metrics | Query Confluent Cloud metrics from the Telemetry API. IMPORTANT: Use the list-available-metrics tool first to discover valid metric names and filter fields. Supports Kafka, Flink, Connectors, and Schema Registry metrics with flexible filtering, aggregation, and grouping. | read
list-available-metrics | List available Confluent Cloud metrics and their filter fields from the Telemetry API. Use this tool BEFORE query-metrics to discover valid metric names, resource filter fields, and grouping labels. | read
