# Snowflake Intelligence Agent — RelationalAI Knowledge Graph
Scaffold for packaging a RelationalAI semantic model as a Snowflake Cortex agent and exposing it through Snowflake Intelligence.
## What this template is for
RelationalAI lets you map Snowflake tables into a semantic layer — concepts, properties, and relationships — that executes directly inside Snowflake. This template wires that semantic layer into a Snowflake Cortex agent so users can ask natural-language questions about their data from anywhere in Snowflake Intelligence.
The scaffold provides the full lifecycle tooling: define your model, test queries locally, deploy stored procedures and the agent, push updates, and tear everything down — all from a single CLI script.
## Who this is for
- Engineers and data modelers who have a Snowflake table they want to expose conversationally
- Teams building or evaluating RAI-powered Snowflake Intelligence integrations
- Assumed knowledge: basic Python, familiarity with Snowflake, some exposure to PyRel concepts (concepts, properties, `model.define`)
## What you’ll build
- A PyRel semantic model (ontology) that maps your source table to typed concepts and properties
- A set of curated analytical queries the agent can execute on demand
- A deployed Snowflake Cortex agent with stored procedures backed by the model
- A Snowflake Intelligence integration that lets users ask natural-language questions against your data
## What’s included

- **Model** (`ontology.example.py`): example concept and property definitions with a parent/child hierarchy and a label-tagging pattern
- **Queries** (`queries.example.py`): two example queries — category breakdown and label traceability — plus the `ToolRegistry` wiring
- **Runner** (`si_agent.py`): CLI for deploy / update / status / chat / teardown
- **Config template** (`rai-agent-config.example.yaml`): all instance-specific values in one place
- **Local smoke test** (`test_queries.example.py`): runs both queries directly against Snowflake before deploying
- **Working example** (`example/`): end-to-end employee directory implementation you can run immediately with the included sample data — useful as a reference before building your own model
## Prerequisites

### Access

- RelationalAI native app installed in your Snowflake account
- `ACCOUNTADMIN` role (or equivalent) with `CREATE PROCEDURE`, `CREATE STAGE`, and `CREATE AGENT` on the target schema
- The target schema must exist before deploying
- Change tracking enabled on the source table:

  ```sql
  ALTER TABLE <YOUR_DB>.<YOUR_SCHEMA>.<YOUR_TABLE> SET CHANGE_TRACKING = TRUE;
  ```

### Tools

- Python 3.10+
- `relationalai>=1.0.12`, `httpx`, `pyyaml` (see `pyproject.toml`)
## Quickstart

New to this scaffold? The `example/` directory contains a fully wired employee directory implementation with sample data you can deploy immediately. Follow the example README to get your first agent running, then come back here to build your own model.

- **Install dependencies**

  ```sh
  python -m venv .venv
  source .venv/bin/activate
  pip install -e .
  ```

- **Copy and fill in config files**

  ```sh
  cp rai-agent-config.example.yaml rai-agent-config.yaml
  cp ontology.example.py ontology.py
  cp queries.example.py queries.py
  cp test_queries.example.py test_queries.py
  ```

  Edit `rai-agent-config.yaml` with your agent name, database, schema, warehouse, model name, and source table. Create `raiconfig.yaml` with your Snowflake credentials (see Configuration below).

- **Define your model**

  Edit `ontology.py` — replace the `Node`/`Edge` example with concepts and properties that match your source table.

- **Define your queries**

  Edit `queries.py` — replace or extend the example query functions. Each function’s docstring is what the LLM sees to decide when to call it.

- **Test locally**

  ```sh
  RAI_CONFIG_FILE_PATH=raiconfig.yaml python test_queries.py
  ```

- **Deploy**

  ```sh
  python si_agent.py deploy
  python si_agent.py status
  ```

- **Test a question**

  ```sh
  python si_agent.py chat "what can I ask about?"
  ```

- **Expected output**

  ```
  Deploying 'MY_AGENT_NAME' to MY_DATABASE.MY_SCHEMA ...
  Agent MY_AGENT_NAME: ACTIVE (2 tools registered)
  ```
## Template structure

```
.
├── README.md                      # this file
├── pyproject.toml                 # dependencies
├── si_agent.py                    # deployment CLI — start here
├── rai-agent-config.example.yaml  # config template → copy to rai-agent-config.yaml
├── ontology.example.py            # model template → copy to ontology.py
├── queries.example.py             # query template → copy to queries.py
├── test_queries.example.py        # local smoke test → copy to test_queries.py
└── example/                       # working employee directory example — try this first
    ├── README.md                  # quickstart for the example
    ├── data/employees.csv         # 12-row sample dataset
    ├── ontology.py                # Employee + ReportsTo concepts
    ├── queries.py                 # headcount_by_department + direct_reports
    ├── test_queries.py            # local smoke test
    └── rai-agent-config.example.yaml
```

Start here: `python si_agent.py deploy`
## Configuration

### `rai-agent-config.yaml`

Single source of truth for all instance-specific values.

```yaml
agent:
  name: MY_AGENT_NAME        # Name shown in the Snowflake Cortex Agents UI
  database: MY_DATABASE      # Snowflake database to deploy into
  schema: MY_SCHEMA          # Schema to deploy into (must already exist)
  warehouse: MY_WAREHOUSE    # Warehouse for sproc execution
  model_name: MY_MODEL_NAME  # RAI model name used internally in each sproc

model:
  source_table: MY_DB.MY_SCHEMA.MY_TABLE  # Fully qualified source table
```

### `raiconfig.yaml` (gitignored)

Snowflake connection credentials used locally and during deployment.

```yaml
connections:
  sf:
    account:       # Snowflake account identifier
    user:          # Your Snowflake username
    password:      # Your password
    warehouse:     # Warehouse for query execution
    role:          # Must have CREATE PROCEDURE, CREATE STAGE, CREATE AGENT
    rai_app_name:  # Name of the RelationalAI native app (usually RELATIONALAI)
```

## Model overview
The example ontology models a hierarchical dataset (e.g. a bill-of-materials or org structure) sourced from a single Snowflake table.
- Key entities: `Node` (a record in the source table) and `Edge` (a parent→child link between nodes)
- Primary identifiers: `Node` is identified by `id` (string); `Edge` by `(parent_id, child_id)`
- Important invariant: rows with a null `id` are silently dropped by null propagation — do not coalesce nulls to a sentinel
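The null-`id` invariant can be illustrated with a toy simulation in plain Python — this mimics the outcome, not the RAI engine's actual mechanics:

```python
# Toy rows as they might come out of the source table.
rows = [
    {"id": "a", "category": "widget"},
    {"id": None, "category": "widget"},  # null id: never becomes a Node
    {"id": "b", "category": "gadget"},
]

# Null propagation: a row with a null id simply produces no Node.
nodes = {r["id"] for r in rows if r["id"] is not None}
assert nodes == {"a", "b"}

# Coalescing nulls to a sentinel instead would merge every null row
# into one bogus shared Node — which is why the invariant forbids it.
coalesced = {r["id"] or "UNKNOWN" for r in rows}
assert "UNKNOWN" in coalesced
```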
### Concepts

**`Node`** — represents a record in the source table, deduplicated by `id`.
| Property | Type | Identifying? | Notes |
|---|---|---|---|
| `id` | string | Yes | Loaded from the `parent_id` and `child_id` columns |
| `category` | string | No | Required for `per()`-based aggregation queries |
| `label` | string | No | Optional classification tag; used by the `label_trace` query |
**`Edge`** — represents a directed parent→child link between two nodes.
| Property | Type | Identifying? | Notes |
|---|---|---|---|
| `parent_id` | string | Yes | FK to `Node.id` |
| `child_id` | string | Yes | FK to `Node.id` |
## How it works

- `ontology.py` maps source table rows into `Node` and `Edge` concepts via `model.define()`
- `queries.py` defines query functions that return `rai.Fragment` objects; each function’s docstring is verbalized to the LLM
- `si_agent.py deploy` packages `ontology.py` and `queries.py` as Snowflake stored procedures and registers a Cortex agent
- Snowflake Intelligence routes natural-language questions to the agent, which selects and executes the appropriate query
- Results are verbalized and returned as a natural-language response
```
Snowflake table → ontology.py → RAI model → queries.py → stored procs → Cortex agent → Snowflake Intelligence
```

## Customize this template
### Use your own data
- Set `model.source_table` in `rai-agent-config.yaml` to your fully qualified table name
- In `ontology.py`, replace `Node`/`Edge` with concepts that match your domain
- Column names in `model.define()` must match your actual table columns
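One way to catch a column mismatch before deploying is to compare the columns your `model.define()` call references against the table's actual columns. The sets below are hypothetical example values — in practice you would fill them from your ontology and a `DESCRIBE TABLE` result:

```python
# Columns the ontology references in model.define() (example values).
expected_columns = {"parent_id", "child_id", "category", "label"}

# Columns reported by e.g. DESCRIBE TABLE (example values).
actual_columns = {"PARENT_ID", "CHILD_ID", "CATEGORY", "NOTES"}

# Snowflake folds unquoted identifiers to upper case, so compare case-insensitively.
missing = {c for c in expected_columns if c.upper() not in actual_columns}
print(f"missing columns: {sorted(missing)}")  # → missing columns: ['label']
```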
### Add a new query

1. Write a function in `queries.py` returning a `rai.Fragment`, with a clear docstring
2. Bind and register it in `build_tool_registry` → `QueryCatalog`
3. Test locally: `python test_queries.py`
4. Push to Snowflake: `python si_agent.py update`
### Seed labels

In `ontology.py`, inside `initialize()`:

```python
node = Node.ref()
model.define(node.label("MY_LABEL")).where(node.id == "some-identifier")
```

Then run `python si_agent.py update`.
### Scale up / productionize

- Pin `relationalai` to a specific version in `pyproject.toml` for reproducible deploys
- Run `si_agent.py update` in CI after merging changes to `ontology.py` or `queries.py`
- Use a dedicated warehouse sized for your query volume
## Troubleshooting

### Why did deployment fail with a permissions error?

- Confirm your Snowflake role has `CREATE PROCEDURE`, `CREATE STAGE`, and `CREATE AGENT` on the target schema.
- Verify the target schema already exists — the deploy step does not create it automatically.
### Why is my source table not found?

- Check that `model.source_table` in `rai-agent-config.yaml` is fully qualified (`DB.SCHEMA.TABLE`).
- Confirm change tracking is enabled: `ALTER TABLE ... SET CHANGE_TRACKING = TRUE`.
### Why do local test queries return duplicate rows?

- This is a known local-execution quirk: `.to_df()` may return one row per `(node, category)` because `node` stays a free variable. Add `.drop_duplicates()` as shown in `test_queries.example.py`. The deployed sproc returns correctly aggregated results.
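The effect can be mimicked with a toy pandas frame: the local result repeats each per-category count once per node, and `drop_duplicates()` collapses it back to one row per category (toy data, not the scaffold's actual output):

```python
import pandas as pd

# Shape of a local .to_df() result where node stayed a free variable:
# the per-category count appears once per node in that category.
df = pd.DataFrame({
    "category": ["widget", "widget", "gadget"],
    "count":    [2, 2, 1],
})

deduped = df.drop_duplicates()  # collapses to one row per (category, count)
assert len(deduped) == 2
```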
### Why did the sproc fail with an `httpx` import error?

- `httpx` is not auto-installed as a transitive dependency in Snowflake sprocs. It is explicitly added via `_EXTRA_PACKAGES` in `si_agent.py` — ensure you have not removed it.
### Why does the agent appear inactive after deploy?

- Run `python si_agent.py status` to see the full deployment state.
- If the agent shows as pending, wait a few seconds and re-check — Cortex agent registration is asynchronous.
### Known workarounds

| Issue | Workaround applied |
|---|---|
| `snowflake-telemetry-python` conflicts with the `relationalai>=1.0.x` opentelemetry dependency | Remove `snowflake-telemetry-python` from the sproc package list in the installed library (`relationalai/agent/cortex/cortex_tool_resources.py`, ~line 278). Re-apply after upgrading `relationalai`. |
| `httpx` not auto-installed as a transitive dep in Snowflake sprocs | Added `"httpx"` to `_EXTRA_PACKAGES` in `si_agent.py`. |
| Local `.to_df()` returns one row per node for count queries | Added `.drop_duplicates()` in `test_queries.example.py`. |
## Learn more

### Core concepts

- Build a semantic model — how `model.Concept`, `model.Property`, and `model.define()` work
- Cortex Agents overview — Snowflake’s agent framework

### Language / modeling reference

- PyRel documentation — tutorials and guides
- PyRel API documentation — full language reference for `relationalai`

### Deeper dives

- Snowflake Intelligence — how to promote a Cortex agent to Snowflake Intelligence