Accessing the Cloud

This guide demonstrates how to interact with data using the supported cloud providers.


By following this guide, you will be able to access data from the cloud in the RKGS database and vice versa using the different available options.

This guide complements the Data Import and Export guides, where you can find all the relevant information for importing and exporting data.

Cloud Storage Providers

Currently, Azure Blob Storage and Amazon Web Services (AWS) S3 are supported.

Provider      URI Prefix   Data              Supported Regions
Azure Blob    azure://     public & private  us-east-1
AWS S3        s3://        public & private  See full list below

Supported Regions

This section lists the supported storage regions for Azure Blob Storage and AWS S3.

Azure Blob Storage

These are the supported storage regions for Azure Blob:

Region                   Supported Region Name
US East (N. Virginia)    us-east-1


AWS S3

These are the supported storage regions for AWS S3:

Region          Supported Region Names
US              us-east-1, us-east-2, us-west-1, and us-west-2
Asia Pacific    ap-east-1, ap-southeast-1, ap-southeast-2, ap-southeast-3, ap-southeast-4, ap-south-1, ap-south-2, ap-northeast-1, ap-northeast-2, and ap-northeast-3
Europe          eu-central-1, eu-central-2, eu-west-1, eu-west-2, eu-west-3, eu-south-1, eu-south-2, and eu-north-1
South America   sa-east-1
Middle East     me-south-1 and me-central-1
AWS GovCloud    us-gov-east-1 and us-gov-west-1

Cloud Parameters

To interact with the cloud storage service, you need to define a module that specifies the data configuration. There are two relevant options for describing how to access the cloud:

path: A string that specifies the location and name of the file you want to import or export. Currently, this can point to azure://... (Microsoft Azure) or s3://... (Amazon S3) URLs.
integration: Credentials needed to access the data.

Public Data

If the cloud storage container provides public read access — for example, this file — you only need to provide the path in the configuration option path.

module config
    def path = "s3://relationalai-documentation-public/csv-import/simple-import-4cols.csv"

Arbitrary URLs are not accepted. For security reasons, only "https://" URL addresses pointing to Microsoft Azure or Amazon S3 are supported.
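
Since this guide complements the Data Import guide, note that a config module like the one above can be passed directly to a data loading relation. A minimal sketch, assuming the standard library's load_csv and the public file shown earlier:

```rel
// Public data: only the path is needed, no integration submodule.
module config
    def path = "s3://relationalai-documentation-public/csv-import/simple-import-4cols.csv"

// Load the CSV file described by config into the relation csv.
def csv = load_csv[config]
```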

Private Data

To access private data, you need to specify cloud credentials in the configuration option integration. The URL is provided via the option path, as with public data.

Azure Blob Storage

To access Azure Blob Storage, you must provide a valid SAS (shared access signature) token.

Within the integration submodule, the following information needs to be provided:

provider: The cloud storage provider, azure.
credentials: The access token as a string, paired with the token name :azure_sas_token.
module config
    def path = "azure://"
    module integration
        def provider = "azure"
        def credentials = (:azure_sas_token, raw"example%of%credentials")

The unescaped percent sign, %, triggers string interpolation within a Rel string. This means you need to store URLs and cloud credentials that contain % as raw strings, as in the example above.
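
To illustrate, here is a sketch contrasting the two string forms. The token value is made up for demonstration purposes:

```rel
// In an ordinary string, % starts an interpolation, so a SAS token
// containing % must be wrapped in a raw string literal:
def credentials = (:azure_sas_token, raw"sig=example%2Btoken&sv=2022-11-02")
```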


AWS S3

To access AWS S3, you must provide a valid access key ID and a secret access key.

Within the integration submodule, you need to provide the following information:

region: The S3 bucket region.
provider: The cloud storage provider, s3.
credentials: The access keys as strings, paired with the token names :access_key_id and :secret_access_key, respectively.
module config
    def path = "s3://my-s3-bucket/myfile.csv"
    module integration
        def region = "us-east-1"
        def provider = "s3"
        def credentials = (:access_key_id, raw"my_access_key_id")
        def credentials = (:secret_access_key, raw"my_secret_access_key")
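
A private config module is used in the same way as a public one. A sketch, assuming a hypothetical bucket and file name, placeholder credentials, and the standard library's load_csv:

```rel
// Private S3 data: path plus an integration submodule with credentials.
module config
    def path = "s3://my-s3-bucket/myfile.csv"
    module integration
        def region = "us-east-1"
        def provider = "s3"
        def credentials = (:access_key_id, raw"my_access_key_id")
        def credentials = (:secret_access_key, raw"my_secret_access_key")

// The credentials are picked up from config:integration during the load.
def csv = load_csv[config]
```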

See Also

Now that you know how to access a cloud storage provider, you can check the Data Import and Export guides to learn how to interact with data. See CSV Import and CSV Export for CSV data, and JSON Import and JSON Export for JSON data.