relationalai.agent.cortex

RelationalAI-Cortex Integration

Deploy Snowflake Cortex AI agents powered by RelationalAI semantic models for Snowflake Intelligence.

```python
from snowflake import snowpark

from relationalai.semantics import Model
from relationalai.config import create_config, SnowflakeConnection
from relationalai.agent.cortex import (
    CortexAgentManager,
    DeploymentConfig,
    discover_imports,
    ToolRegistry,
)

# Create session using role for deployment
session: snowpark.Session = create_config().get_session(SnowflakeConnection)

# Configure manager
manager = CortexAgentManager(
    session=session,
    config=DeploymentConfig(
        agent_name="MY_ASSISTANT",  # Unique name for the agent
        database="MY_DB",           # Snowflake database for deployment (sprocs & agent)
        schema="MY_SCHEMA",         # Snowflake schema for deployment (sprocs & agent)
        warehouse="COMPUTE_WH",     # Warehouse for RAI tool execution (SI users need USAGE)
    ),
)

# Your model definition function
def initialize(m: Model):
    # Define your entities, relationships, and computed properties
    ...

# Initialize RAI tools
def init_tools(model: Model):
    initialize(model)  # Initialize the RAI Model with your semantics
    return ToolRegistry().add(
        model=model,        # Expose the model through tools
        description="...",
    )

# Deploy
manager.deploy(
    init_tools=init_tools,       # Initialize RAI tools
    imports=discover_imports(),  # Package local Python modules into the sprocs
)
print(manager.status())
```

After deployment, you can find the Cortex Agent in the Snowflake UI under AI & ML > Agents. The UI allows you to chat with the agent, preview or add it to Snowflake Intelligence, and view traces of conversations the agent participates in.

For programmatic access (testing, automation):

```python
chat = manager.chat()
response = chat.send("What can I ask about?")
print(response.full_text())
```

Conversations through the programmatic interface are persisted and available through the Monitoring tab for the agent in the Snowflake UI.

The RelationalAI-Cortex integration:

  1. operationalizes your RAI Model as Snowflake stored procedures, and
  2. configures a Cortex Agent with these tools and with instructions on how to explore your semantic models and answer questions about your data.
| Tool | Purpose |
| --- | --- |
| RAI_DISCOVER_MODELS | Discover available models and their key concepts |
| RAI_VERBALIZE_MODEL | Get detailed model structure and relationships |
| RAI_EXPLAIN_CONCEPT | Understand business rules for specific concepts |
| RAI_QUERY_MODEL | Execute pre-defined queries (PREVIEW — requires allow_preview=True) |

All stored procedures are created with CALLER’S RIGHTS — they execute under the SI user’s Snowflake role, not the deployer’s role. This means:

  • Data access is governed by the caller’s role. Users only see data their role can SELECT.
  • No privilege escalation. The agent never operates with more permissions than the person using it.

This makes Snowflake’s existing RBAC the single source of truth for data governance across all agent interactions.
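The effect of CALLER'S RIGHTS can be illustrated with a toy sketch (plain Python, not RAI or Snowflake code): the same tool call succeeds or fails depending only on the caller's own grants, so the agent never escalates.

```python
# Toy model (not RAI or Snowflake code) of the CALLER'S RIGHTS behavior:
# every tool invocation is checked against the *caller's* grants.
GRANTS = {
    "analyst_role": {"SELECT:sales.orders"},
    "admin_role": {"SELECT:sales.orders", "SELECT:hr.salaries"},
}

def can_select(role: str, table: str) -> bool:
    """True only if the caller's own role holds SELECT on the table."""
    return f"SELECT:{table}" in GRANTS.get(role, set())

# Same agent, same query, different caller -> different outcome:
print(can_select("admin_role", "hr.salaries"))    # True
print(can_select("analyst_role", "hr.salaries"))  # False
```

In real deployments these checks are performed by Snowflake itself; nothing in the agent layer duplicates or overrides them.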

The following permissions are required for two roles:

  1. the deployer/admin who creates the Cortex Agent
  2. SI users who interact with it through the Snowflake Intelligence UI (or programmatic users of the Cortex Agent)

Both require the rai_developer role.

The rai_developer role is created during RAI Native App installation. It grants access to RAI and includes the following privileges by default:

| Privilege | Purpose |
| --- | --- |
| USAGE on S3_RAI_INTERNAL_BUCKET_EGRESS_INTEGRATION | External access for RAI stored procedures |
| USAGE on PYPI_ACCESS_INTEGRATION | Access to Python packages during sproc execution |

These are necessary to execute the Snowflake stored procedures.

In addition to the rai_developer privileges above, the deployer role requires:

| Privilege | Purpose |
| --- | --- |
| CREATE STAGE on target schema | Store sproc dependencies (if manage_stage=True) |
| CREATE PROCEDURE on target schema | Register RAI tool sprocs |
| CREATE AGENT on target schema | Register the Cortex agent |
| Database role snowflake.cortex_user | Access Cortex services |
| Application role snowflake.ai_observability_events_lookup | Monitor AI observability events |
| Database role snowflake.pypi_repository_user | Install Python packages in the sproc environment |
| rai_developer role | Access RAI (see above) |
| USAGE on database and schema | Access the deployment target |

Because sprocs use CALLER’S RIGHTS, SI users need privileges on the resources the agent accesses at runtime:

| Privilege | Purpose |
| --- | --- |
| USAGE on warehouse | Execute sprocs (warehouse from DeploymentConfig, or session default) |
| Database role snowflake.cortex_user | Access Cortex services |
| Database role snowflake.pypi_repository_user | Install Python packages in the sproc environment |
| rai_developer role | Access RAI (see above) |
| USAGE on database and schema | Access the data |
| SELECT on tables | Read data accessed by the model |
| EXECUTE on stored procedures | Invoke RAI tools |
For example, the two roles can be provisioned as follows:

```sql
-- Deployer role
create role my_deployer_role;
grant create stage on schema my_db.my_schema to role my_deployer_role;
grant create procedure on schema my_db.my_schema to role my_deployer_role;
grant create agent on schema my_db.my_schema to role my_deployer_role;
grant database role snowflake.cortex_user to role my_deployer_role;
grant database role snowflake.pypi_repository_user to role my_deployer_role;
grant application role snowflake.ai_observability_events_lookup to role my_deployer_role;
grant role rai_developer to role my_deployer_role;
grant usage on database my_db to role my_deployer_role;
grant usage on schema my_db.my_schema to role my_deployer_role;

-- SI user role
create role my_si_user_role;
grant usage on warehouse compute_wh to role my_si_user_role;
grant database role snowflake.cortex_user to role my_si_user_role;
grant database role snowflake.pypi_repository_user to role my_si_user_role;
grant role rai_developer to role my_si_user_role;
grant usage on database my_db to role my_si_user_role;
grant usage on schema my_db.my_schema to role my_si_user_role;
grant select on all tables in schema my_db.my_schema to role my_si_user_role;
grant execute on all procedures in schema my_db.my_schema to role my_si_user_role;
```
DeploymentConfig accepts the following parameters:

| Parameter | Required | Default | Description |
| --- | --- | --- | --- |
| database | Yes | - | Snowflake database where the agent will be deployed |
| schema | Yes | - | Snowflake schema where the agent will be deployed. To enable interaction through the Snowflake Intelligence UI, deploy to the schema configured for your account (typically AGENTS) |
| agent_name | Yes | - | Unique name for the Cortex agent within the schema |
| model_name | No | Same as agent_name | Name for the Model instance created inside each stored procedure |
| warehouse | No | None | Snowflake warehouse the Cortex agent will use when executing RAI tools. SI users need USAGE on this warehouse. If omitted, tools use the caller's session warehouse (stored procedures are created with CALLER'S RIGHTS) |
| stage_name | No | "rai_sprocs" | Name of the Snowflake stage for storing sproc dependencies |
| manage_stage | No | True | If True, automatically create/drop the stage during deploy/cleanup. Set to False if using a pre-existing stage |
| llm | No | "claude-sonnet-4-5" | Language model for agent orchestration. Must be available in Snowflake Cortex |
| query_timeout_s | No | 300 | Timeout in seconds for stored procedure execution |
| budget_seconds | No | None | Time budget in seconds for agent execution. Can be combined with budget_tokens |
| budget_tokens | No | None | Token budget for model consumption. Can be combined with budget_seconds |
| external_access_integration | No | "S3_RAI_INTERNAL_BUCKET_EGRESS_INTEGRATION" | External access integration for sprocs. USAGE is granted by the rai_developer role by default |
| artifact_repository | No | "snowflake.snowpark.pypi_shared_repository" | Artifact repository for Python packages |
| allow_preview | No | False | Allow PREVIEW capabilities (e.g., pre-defined queries) |

Verbalizers control how model structure is presented to the agent.

The default ModelVerbalizer returns relationship readings extracted from the RAI model:

```
Customer has many Orders
Order has one Customer
Order contains many OrderItems
...
```

This is the default — no configuration needed.
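As a purely illustrative sketch (not RAI's implementation), readings of this shape can be produced from (subject, cardinality phrase, object) triples:

```python
# Illustrative only: rendering relationship triples as the natural-language
# readings shown above. Not RAI's actual verbalizer code.
RELATIONSHIPS = [
    ("Customer", "has many", "Orders"),
    ("Order", "has one", "Customer"),
    ("Order", "contains many", "OrderItems"),
]

def verbalize(relationships):
    """Render each triple as a natural-language reading."""
    return [f"{subject} {phrase} {obj}" for subject, phrase, obj in relationships]

print("\n".join(verbalize(RELATIONSHIPS)))
```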

SourceCodeVerbalizer extends ModelVerbalizer: explain_model returns the standard relationship readings, while explain_concept returns the Python source code from the modules you provide, filtered to those that reference the requested concept.

Pass the model as the first argument, followed by your model definition functions:

```python
from relationalai.agent.cortex import SourceCodeVerbalizer

def init_tools(model: Model):
    init_model(model)
    return ToolRegistry().add(
        model=model,
        description="Customers and orders",
        verbalizer=SourceCodeVerbalizer(model, init_model),
    )
```

The agent sees actual Python code for specific concepts, including business logic and computed properties.

Comments are included, so clarifications and justifications of your code will benefit the agent as well.
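The concept filtering can be sketched in a few lines (a simplification, not the SourceCodeVerbalizer internals): keep only the source snippets, comments included, that mention the requested concept.

```python
# Simplified sketch of concept filtering (not the SourceCodeVerbalizer
# internals). The snippets below are hypothetical examples.
SOURCES = [
    "def gold_customers(m):\n    # Customers with high lifetime value are 'gold'.\n    ...",
    "def shipping_rule(m):\n    # Orders ship within two business days.\n    ...",
]

def sources_for(concept: str, sources: list[str]) -> list[str]:
    """Return snippets whose code or comments reference the concept."""
    return [src for src in sources if concept.lower() in src.lower()]

print(len(sources_for("customer", SOURCES)))  # 1: only the first snippet matches
```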

Note: The queries capability is in PREVIEW. Deployment requires allow_preview=True. Please contact us first!

Pre-defined queries let domain experts create common analytical queries that agents can discover and execute.

Create a class to hold your queries with the model as a dependency:

```python
import relationalai.semantics as rai

class CustomerAnalysis:
    def __init__(self, m):
        self.m = m

    def segment_summary(self) -> rai.Fragment:
        customer = self.m.Customer.ref()
        segment = self.m.Customer.ValueSegment.ref()
        order = self.m.Order.ref()
        g = rai.per(segment)
        return self.m.select(
            segment.name.alias("segment"),
            g.sum(customer.ltv).alias("revenue"),
            g.sum(order.profit)
                .where(customer.order(order))
                .alias("profit"),
        ).where(
            customer.value_segment(segment)
        )

from relationalai.agent.cortex import QueryCatalog

def init_tools(model: Model):
    init_model(model)
    queries = CustomerAnalysis(model)
    return ToolRegistry().add(
        model=model,
        description="Customers and orders",
        verbalizer=SourceCodeVerbalizer(model, init_model),
        queries=QueryCatalog(queries.segment_summary),
    )

# Deploy with preview enabled — set allow_preview on DeploymentConfig
manager = CortexAgentManager(
    session=session,
    config=DeploymentConfig(
        agent_name="MY_ASSISTANT",
        database="EXAMPLE",
        schema="CORTEX",
        warehouse="COMPUTE_WH",
        allow_preview=True,
    ),
)
manager.deploy(
    init_tools=init_tools,
    imports=discover_imports(),
)
```

The agent can discover these queries and execute them with appropriate parameters based on user requests.
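The discover-and-dispatch pattern behind a query catalog can be sketched in plain Python (illustrative only; QueryCatalog's real interface may differ): register callables by name so an agent can list them, read their descriptions, and invoke one with parameters.

```python
# Illustrative sketch of the catalog pattern, not QueryCatalog itself.
class Catalog:
    def __init__(self, *queries):
        self._queries = {q.__name__: q for q in queries}

    def discover(self) -> dict[str, str]:
        """Names and descriptions the agent can choose from."""
        return {name: (q.__doc__ or "").strip() for name, q in self._queries.items()}

    def execute(self, name: str, **params):
        """Invoke a registered query by name with the given parameters."""
        return self._queries[name](**params)

# Hypothetical query used only for this sketch.
def segment_summary(min_ltv: int = 0):
    """Revenue and profit per customer value segment."""
    return f"summary for ltv >= {min_ltv}"

catalog = Catalog(segment_summary)
print(catalog.discover())
print(catalog.execute("segment_summary", min_ltv=100))
```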

When deploying, you need to provide code that runs inside Snowflake stored procedures. There are two parameters for this:

The imports parameter packages your local Python files (model definitions, queries, etc.) and uploads them to Snowflake. Use discover_imports() to automatically find all modules imported by your init_tools function:

```python
manager.deploy(
    init_tools=init_tools,
    imports=discover_imports(),  # Packages your project code
)
```

discover_imports() recursively discovers all local imports starting from the calling file. It excludes standard library and installed packages.
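The idea behind such discovery can be sketched with the standard library's ast module (a simplification; discover_imports() itself may work differently): parse a file's import statements and keep only the names that resolve to files under the project root.

```python
# Simplified sketch of local-import discovery, not discover_imports() itself.
import ast
from pathlib import Path

def local_imports(entry: Path, project_root: Path) -> set[str]:
    """Top-level modules imported by `entry` that live under `project_root`
    (i.e., local code rather than stdlib or installed packages)."""
    tree = ast.parse(entry.read_text())
    names: set[str] = set()
    for node in ast.walk(tree):
        if isinstance(node, ast.Import):
            names.update(alias.name.split(".")[0] for alias in node.names)
        elif isinstance(node, ast.ImportFrom) and node.module:
            names.add(node.module.split(".")[0])
    # Keep only names that exist as modules or packages in the project root.
    return {
        name for name in names
        if (project_root / f"{name}.py").exists()
        or (project_root / name / "__init__.py").exists()
    }
```

A full implementation would also recurse into each discovered module's own imports; this sketch handles a single file.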

The extra_packages parameter specifies PyPI packages that Snowflake will install in the stored procedure environment. Use this for third-party libraries your code depends on:

```python
manager.deploy(
    init_tools=init_tools,
    imports=discover_imports(),
    extra_packages=["pandas==2.0.0", "numpy"],  # Installed by Snowflake
)
```

Note: relationalai is included automatically.

Update tool definitions without recreating the agent:

```python
def init_tools_v2(model: Model):
    # Updated tool definitions
    ...

manager.update(init_tools=init_tools_v2, imports=discover_imports())
```

Check deployment status or tear down all resources:

```python
print(manager.status())  # Reports what exists (agent, stage, sprocs)
manager.cleanup()        # Drops agent, sprocs, and stage — permanently deletes SI conversation history
```
