Snowflake’s managed MCP servers accept external OAuth tokens. I tested this end-to-end — tool discovery, SQL execution, and RBAC enforcement — using JWT-based external OAuth with custom-signed tokens. It works, and it means MCP servers can be accessed by any external AI agent that can obtain an OAuth token from your identity provider.
The test code is on GitHub: sfc-gh-kkeller/mcp_testing_oauth.
Why This Matters#
MCP is becoming the standard protocol for AI agents to access data sources and tools. Snowflake’s managed MCP servers expose tools like SYSTEM_EXECUTE_SQL — letting agents run queries against Snowflake through a standard JSON-RPC interface.
But for enterprise use, agents need to authenticate through your existing identity infrastructure — Okta, Entra ID, Ping, or your own authorization server. External OAuth makes this possible: the agent obtains a JWT from your IdP, passes it to the MCP server, and Snowflake validates the token, maps it to a user and role, and enforces RBAC on every tool call.
This is the same authentication pattern used for Snowflake drivers and connectors, now confirmed working for MCP servers.
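The request an agent sends can be sketched in a few lines. This is an illustrative helper, not the test repo's code; the endpoint path and payload shape follow the description in this post, and the account, database, and server names are placeholders.

```python
import json

# Illustrative sketch: build the HTTP request an external agent would send
# to a managed MCP server. Endpoint path and JSON-RPC payload shape follow
# this article; all names are placeholders.
def build_mcp_request(account_url, db, schema, server, token, method, params=None):
    url = (f"{account_url}/api/v2/databases/{db}/schemas/{schema}"
           f"/mcp-servers/{server}")
    headers = {
        "Content-Type": "application/json",
        "Accept": "application/json",          # required; see Gotchas below
        "Authorization": f"Bearer {token}",    # the external OAuth JWT
    }
    body = {"jsonrpc": "2.0", "id": 1, "method": method}
    if params is not None:
        body["params"] = params
    return url, headers, json.dumps(body)

url, headers, body = build_mcp_request(
    "https://myorg-myaccount.snowflakecomputing.com",
    "MY_DB", "MY_SCHEMA", "MY_MCP_SERVER", "eyJ...", "tools/list")
```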
The Authentication Flow#
┌────────────────────────────┐
│ External OAuth Token │
│ (JWT signed with RSA key) │
│ │
│ iss: <your_issuer> │
│ aud: <your_audience> │
│ scp: session:role:<ROLE> │
│ name: <SF_LOGIN_NAME> │
└────────────┬───────────────┘
│
▼
┌─────────────────────────────────────────────────────────┐
│ Snowflake External OAuth Security Integration │
│ │
│ - Validates JWT signature against RSA public key │
│ - Maps token claims to Snowflake user + role │
│ - EXTERNAL_OAUTH_ANY_ROLE_MODE = ENABLE │
└────────────────────────┬────────────────────────────────┘
│
▼
┌─────────────────────────────────────────────────────────┐
│ Managed MCP Server │
│ (DB.SCHEMA.YOUR_MCP_SERVER) │
│ │
│ Endpoint: POST /api/v2/databases/{db}/schemas/{schema} │
│ /mcp-servers/{name} │
│ │
│ Tool: sql-exec-tool (SYSTEM_EXECUTE_SQL) │
│ RBAC: Role from token must have USAGE on MCP server │
└─────────────────────────────────────────────────────────┘

The token carries the role in its scp claim (format: session:role:MY_ROLE). Snowflake maps this to the corresponding Snowflake role — the same RBAC enforcement that governs SQL queries, Cortex AI calls, and every other Snowflake operation. The MCP server doesn’t bypass governance.
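To sanity-check what a token carries before sending it, you can decode the payload segment with nothing but the standard library (Snowflake verifies the signature server-side, so this inspection deliberately skips it). The demo token below is unsigned and purely illustrative; real tokens are RS256-signed by your IdP.

```python
import base64
import json

# Illustrative sketch: inspect the claims of an external OAuth JWT without
# verifying its signature. A JWT is three base64url segments joined by dots:
# header.payload.signature.
def jwt_claims(token):
    payload = token.split(".")[1]
    payload += "=" * (-len(payload) % 4)   # restore stripped base64 padding
    return json.loads(base64.urlsafe_b64decode(payload))

# Build a throwaway unsigned demo token (empty signature segment):
demo = ".".join(
    base64.urlsafe_b64encode(json.dumps(part).encode()).decode().rstrip("=")
    for part in (
        {"alg": "RS256", "typ": "JWT"},
        {"iss": "https://your-idp.example.com",
         "scp": "session:role:MY_ROLE",
         "name": "MY_LOGIN"},
    )
) + "."

claims = jwt_claims(demo)
# claims["scp"] → "session:role:MY_ROLE"
```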
Setup#
1. Create the External OAuth Security Integration#
CREATE OR REPLACE SECURITY INTEGRATION my_ext_oauth
TYPE = EXTERNAL_OAUTH
ENABLED = TRUE
EXTERNAL_OAUTH_TYPE = CUSTOM
EXTERNAL_OAUTH_ISSUER = 'https://your-idp.example.com'
EXTERNAL_OAUTH_RSA_PUBLIC_KEY = 'MIIBIjANBgkqhki...'
EXTERNAL_OAUTH_TOKEN_USER_MAPPING_CLAIM = 'name'
EXTERNAL_OAUTH_SNOWFLAKE_USER_MAPPING_ATTRIBUTE = 'login_name'
EXTERNAL_OAUTH_SCOPE_MAPPING_ATTRIBUTE = 'scp'
EXTERNAL_OAUTH_ANY_ROLE_MODE = 'ENABLE';

2. Create a UDF to Generate Tokens#
This UDF mints JWTs signed with the RSA private key matching the public key in the integration. Useful for testing — in production, your IdP issues the tokens.
CREATE OR REPLACE FUNCTION generate_token_test()
RETURNS VARCHAR
LANGUAGE PYTHON
RUNTIME_VERSION = '3.11'
PACKAGES = ('pyjwt','cryptography')
HANDLER = 'udf'
AS $$
import jwt
from datetime import datetime, timedelta
def udf():
private_key = b"""-----BEGIN RSA PRIVATE KEY-----
<your_rsa_private_key>
-----END RSA PRIVATE KEY-----"""
now = datetime.utcnow()
encoded = jwt.encode({
"iss": "<your_issuer_url>",
"aud": "<your_audience_url>",
"scp": "session:role:<YOUR_ROLE>",
"name": "<YOUR_SNOWFLAKE_LOGIN_NAME>",
"iat": now,
"exp": now + timedelta(minutes=100)
}, private_key, algorithm="RS256")
return encoded
$$;

3. Create the Managed MCP Server#
CREATE OR REPLACE MCP SERVER my_db.my_schema.my_mcp_server
FROM SPECIFICATION $$
tools:
- title: "SQL Execution Tool"
name: "sql-exec-tool"
type: "SYSTEM_EXECUTE_SQL"
description: "Execute SQL queries against Snowflake."
$$;

4. Grant RBAC Permissions#
The role encoded in the token’s scp claim needs access to the MCP server and its parent database/schema:
GRANT USAGE ON DATABASE my_db TO ROLE my_agent_role;
GRANT USAGE ON SCHEMA my_db.my_schema TO ROLE my_agent_role;
GRANT USAGE ON MCP SERVER my_db.my_schema.my_mcp_server TO ROLE my_agent_role;

5. Call the MCP Server#
# Generate a token via the UDF
TOKEN=$(snowsql -c my_conn -q "SELECT generate_token_test()" \
-o output_format=plain -o header=false)
# Discover available tools
curl -s -X POST \
"https://<account>.snowflakecomputing.com/api/v2/databases/MY_DB/schemas/MY_SCHEMA/mcp-servers/MY_MCP_SERVER" \
-H "Content-Type: application/json" \
-H "Accept: application/json" \
-H "Authorization: Bearer $TOKEN" \
-d '{"jsonrpc":"2.0","id":1,"method":"tools/list"}'
# Execute a SQL query
curl -s -X POST \
"https://<account>.snowflakecomputing.com/api/v2/databases/MY_DB/schemas/MY_SCHEMA/mcp-servers/MY_MCP_SERVER" \
-H "Content-Type: application/json" \
-H "Accept: application/json" \
-H "Authorization: Bearer $TOKEN" \
-d '{
"jsonrpc":"2.0",
"id":2,
"method":"tools/call",
"params":{
"name":"sql-exec-tool",
"arguments":{"sql":"SELECT CURRENT_USER(), CURRENT_ROLE()"}
}
}'

What Was Tested#
The test script runs three tests end-to-end:
| Test | Method | Result |
|---|---|---|
| Tool discovery | tools/list via JSON-RPC POST | 200 OK — returned tool schema |
| Tool invocation | tools/call (SYSTEM_EXECUTE_SQL) | 200 OK — executed SQL, returned results |
| RBAC enforcement | Revoke USAGE, call again, re-grant | JSON-RPC error: “does not exist or not authorized” |
The third test is the critical one — it proves RBAC is enforced dynamically. The test revokes USAGE on the MCP server from the token’s role, makes the same call (which now fails with a JSON-RPC error), then re-grants the permission. This confirms that the MCP server respects Snowflake’s authorization model in real time, not just at token validation.
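Because authorization failures arrive as HTTP 200 with a JSON-RPC error object (see Gotchas), client code has to inspect the response body rather than the HTTP status. A minimal sketch of that check, with hypothetical example responses modeled on the behavior described here:

```python
import json

# Illustrative helper: MCP auth/RBAC failures come back as HTTP 200 with a
# JSON-RPC error object, so the body must be inspected, not the HTTP status.
# Returns the result on success, raises on a JSON-RPC error.
def unwrap_jsonrpc(response_text):
    msg = json.loads(response_text)
    if "error" in msg:
        err = msg["error"]
        raise RuntimeError(f"JSON-RPC error {err.get('code')}: {err.get('message')}")
    return msg.get("result")

# Successful call (hypothetical result payload):
ok = unwrap_jsonrpc('{"jsonrpc":"2.0","id":2,"result":{"rows":[["ME","MY_ROLE"]]}}')

# After USAGE is revoked, the same call yields a JSON-RPC error:
try:
    unwrap_jsonrpc('{"jsonrpc":"2.0","id":2,"error":{"code":-32603,'
                   '"message":"does not exist or not authorized"}}')
except RuntimeError as e:
    denied = str(e)
```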
Gotchas#
A few things I learned the hard way while testing:
| Issue | What Happens | Fix |
|---|---|---|
| Missing Accept header | Error 391902 | Always include Accept: application/json |
| Auth failure response | HTTP 200 with JSON-RPC error -32603, not 401/403 | Check the JSON-RPC error body, not the HTTP status |
| Wrong role in token | MCP server “doesn’t exist” error | Token’s scp must match a role with USAGE grants on the MCP server |
| SQL parameter name | Tool expects sql, not statement | Use "arguments":{"sql":"..."} |
| Account URL format | Connection fails with underscores | Use hyphens in the hostname (e.g., my-account not my_account) |
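The account URL gotcha amounts to a one-line transformation: hostnames cannot contain underscores, so any underscores in the account identifier must become hyphens. A sketch (the function name and the assumption that a bare identifier is being converted are mine):

```python
# Illustrative fix for the account URL gotcha: DNS hostnames cannot contain
# underscores, so account identifiers like myorg_myaccount must be written
# with hyphens when used as a hostname.
def account_hostname(account_identifier):
    return f"{account_identifier.replace('_', '-')}.snowflakecomputing.com"

host = account_hostname("myorg_myaccount")
# host → "myorg-myaccount.snowflakecomputing.com"
```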
ZTA Implications#
This confirms that managed MCP servers sit inside Snowflake’s Zero Trust enforcement stack:
- Identity — external OAuth tokens are validated against the security integration’s RSA public key
- Authorization — the token’s role claim maps to Snowflake RBAC; the role must have explicit USAGE grants on the MCP server
- Network — the MCP endpoint respects account-level network policies and PrivateLink
- Audit — every MCP tool call is logged in QUERY_HISTORY with the authenticated user and role
- Data governance — SQL executed through MCP is subject to masking policies, row access policies, and Horizon tagging
External AI agents connecting via OAuth get the same governance as any other Snowflake client. No backdoor, no bypass.
Running the Tests#
git clone https://github.com/sfc-gh-kkeller/mcp_testing_oauth.git
cd mcp_testing_oauth
pixi install
pixi run test

Or with custom configuration:
SNOWFLAKE_CONNECTION_NAME=my_connection \
MCP_DATABASE=MY_DB \
MCP_SCHEMA=MY_SCHEMA \
MCP_SERVER_NAME=MY_MCP_SERVER \
MCP_TOKEN_ROLE=MY_ROLE \
SNOWFLAKE_ACCOUNT_URL=https://myorg-myaccount.snowflakecomputing.com \
pixi run test

| Variable | Default | Description |
|---|---|---|
| SNOWFLAKE_CONNECTION_NAME | default | Connection from ~/.snowflake/connections.toml |
| SNOWFLAKE_ACCOUNT_URL | — | Full account URL |
| MCP_DATABASE | MY_DB | Database containing the MCP server |
| MCP_SCHEMA | MY_SCHEMA | Schema containing the MCP server |
| MCP_SERVER_NAME | MY_MCP_SERVER | Managed MCP server name |
| MCP_TOKEN_UDF | <db>.<schema>.generate_token_test | UDF that returns an OAuth token |
| MCP_TOKEN_ROLE | MY_ROLE | Role for the RBAC revoke/re-grant test |
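The environment-variable-with-default pattern in the table above can be resolved in a few lines. This is a sketch of how such a script *could* read its configuration, not the repo's actual implementation; the variable names and defaults come from the table.

```python
import os

# Sketch: resolve each configuration variable from the environment, falling
# back to the defaults listed in the table. Names and defaults match the
# table; the resolution logic itself is illustrative.
DEFAULTS = {
    "SNOWFLAKE_CONNECTION_NAME": "default",
    "MCP_DATABASE": "MY_DB",
    "MCP_SCHEMA": "MY_SCHEMA",
    "MCP_SERVER_NAME": "MY_MCP_SERVER",
    "MCP_TOKEN_ROLE": "MY_ROLE",
}

def resolve_config(env=None):
    env = os.environ if env is None else env
    return {key: env.get(key, default) for key, default in DEFAULTS.items()}

cfg = resolve_config({"MCP_DATABASE": "PROD_DB"})
# cfg["MCP_DATABASE"] → "PROD_DB"; everything else keeps its default
```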
