What's New in Version 1.0.5

March 17, 2026 6:06 PM UTC

Version 1.0.5 of the relationalai Python package is now available!

To upgrade, activate your virtual environment and run the following command:

pip install --upgrade relationalai
  • Added explicit schema= support to data() and Model.data(). Before, PyRel always inferred column types from the in-memory rows or DataFrame you provided, which made empty inputs and ambiguous columns harder to control. Now you can declare some or all column types explicitly, and the returned Data object preserves those types even when the input is empty.

    For example:

    from relationalai.semantics import Date, Integer, Model, String

    m = Model("OpsModel")

    # A daily exceptions feed can legitimately be empty, but downstream logic
    # still needs stable column types.
    shipment_exceptions = m.data(
        [],
        schema={
            "order_id": Integer,
            "warehouse": String,
            "reason": String,
            "requested_ship_date": Date,
        },
    )

    # You can still use the schema in model logic, even with no rows.
    ShipmentException = m.Concept("ShipmentException")
    m.define(ShipmentException.new(shipment_exceptions.to_schema()))
  • PyRel now supports batching multiple pending table exports into a single model execution. Before, export-heavy workflows had to execute table exports separately. Now you can queue multiple .into() exports and submit them together with m.exec():

    # Queue two export fragments. Customer, Product, customer_table, and
    # product_table are assumed to be defined earlier in the model.
    m.select(Customer.id, Customer.name).into(customer_table)
    m.select(Product.id, Product.name).into(product_table)

    # Execute both exports together in a single model execution.
    m.exec()
  • Improved dtype inference for pandas float columns. Before, dataframe float columns could be inferred as a more generic numeric type. Now PyRel maps pandas float columns to Float, which keeps inferred schemas closer to the original dataframe types.
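
    For example, here is a minimal sketch, using pandas alone rather than PyRel, of the dtype that is now mapped to Float:

```python
import pandas as pd

# pandas infers float64 for a column of Python floats.
df = pd.DataFrame({"sku": ["A1", "B2"], "price": [19.99, 5.25]})

# PyRel's improved inference maps this float64 column to Float rather than
# a more generic numeric type, keeping the inferred schema closer to the
# original dataframe.
print(df.dtypes["price"])  # float64
```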

  • Fixed table-based schema and column mapping when model.implicit_properties is disabled. model.implicit_properties controls whether PyRel creates undeclared properties automatically. Before, when you turned this option off, Table objects could still implicitly create properties. Now Table objects respect the model.implicit_properties setting, so you can disable implicit properties for tables as well as concepts.

  • Fixed use of std.datetime.datetime.now() in queries and filters. Before, using the current date and time in a query or filter could fail with an argument error. Now you can use std.datetime.datetime.now() directly in PyRel logic, including in time-based derived definitions.

    For example:

    from relationalai.semantics import DateTime, Integer, Model, std
    from relationalai.semantics.std.datetime import datetime

    m = Model("OpsModel")

    # Declare the base shipment concept.
    Shipment = m.Concept("Shipment", identify_by={"id": Integer})

    # Add the promised ship timestamp.
    Shipment.promised_ship_at = m.Property(f"{Shipment} promised ship at {DateTime}")

    OverdueShipments = m.Concept("OverdueShipments", extends=[Shipment])

    # Define some shipment entities.
    m.define(
        Shipment.new(id=1, promised_ship_at=datetime(2026, 3, 17, 9, 0)),
        Shipment.new(id=2, promised_ship_at=datetime(2026, 3, 28, 9, 0)),
    )

    # Define overdue shipments using the current time.
    m.define(OverdueShipments(Shipment)).where(
        Shipment.promised_ship_at < std.datetime.datetime.now(),
    )
  • Fixed ProgrammaticAccessTokenAuth so PAT-based Snowflake authentication includes the required user field and expands ~ in token_file_path. Before, Snowflake could reject the PAT as invalid, and token files referenced with ~ could fail with FileNotFoundError. Now PAT authentication works in both cases.
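
    The tilde-expansion part of this fix can be sketched with the standard library alone; `resolve_token_path` is a hypothetical helper for illustration, not PyRel API:

```python
import os

def resolve_token_path(token_file_path: str) -> str:
    # Hypothetical helper illustrating the behavior of the fix: expand "~"
    # so a path like "~/.rai/token" becomes an absolute path under the
    # user's home directory instead of failing with FileNotFoundError.
    return os.path.expanduser(token_file_path)

resolved = resolve_token_path("~/.rai/token")
print("~" not in resolved)  # True
```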

  • Fixed Problem.solve() so it uses the app name from the model config. Before, Problem.solve() could ignore a custom rai_app_name in the model config. Now it reuses the configured app name.

  • Fixed create_config() for PyRel running inside an existing Snowflake session, including notebooks, stored procedures, and SPCS containers. Before, the returned config could fail when PyRel later needed a Snowflake SQL connection. Now configs created from active sessions work correctly in those environments.

  • Fixed notebook diagnostics so notebook users see all collected PyRel diagnostics in a single exception. Before, earlier diagnostics could be easy to miss, so you often saw only the final failure. Now notebook exceptions include the diagnostic context that led to the error.