Accessing the Cloud
This guide demonstrates how to interact with data using the supported cloud providers.
By following this guide, you will be able to access data from the cloud in the RKGS database and vice versa using the different available options.
This guide complements the Data Import and Export guides, where you can find all the relevant information for importing and exporting data.
| Provider | URI Prefix | Data | Read-Only Access | Write Access | Supported Regions |
| --- | --- | --- | --- | --- | --- |
| Azure Blob | `azure://` | public & private | ✅ | ✅ | See full list |
| AWS S3 | `s3://` | public & private | ✅ | ✅ | See full list |
This section lists the supported storage regions for Azure Blob Storage and AWS S3.
These are the supported storage regions for Azure Blob:
| Region | Supported Region Name |
| --- | --- |
| US East (N. Virginia) | |
These are the supported storage regions for AWS S3:
| Region | Supported Region Names |
| --- | --- |
| AWS Gov Cloud | |
To interact with the cloud storage service, you need to define a module that specifies the data configuration. There are two relevant options for describing how to access the cloud:
| Option | Description |
| --- | --- |
| `path` | A string that specifies the location and name of the file you want to import or export. Currently, this can point to an Azure Blob Storage or AWS S3 location. |
| `integration` | Credentials needed to access the data. |
If the cloud storage container provides public read access (for example, this file), you only need to provide the `path` configuration option:
```rel
module config
    def path = "s3://relationalai-documentation-public/csv-import/simple-import-4cols.csv"
end
```
Arbitrary URLs are not accepted. For security reasons, only `"https://"` URL addresses pointing to Microsoft Azure or Amazon S3 are supported.
To access private data, you need to specify cloud credentials in the `integration` configuration option. The URL is provided via the `path` option, as with public data.
To access Azure Blob Storage, you must provide a valid SAS token. Within the `integration` submodule, the following information needs to be provided:

| Field | Description |
| --- | --- |
| `provider` | Cloud storage provider, in this case `"azure"`. |
| `credentials` | Access token as a string, along with the token name `:azure_sas_token`. |
```rel
module config
    def path = "azure://myaccount.blob.core.windows.net/sascontainer/myfile.csv"

    module integration
        def provider = "azure"
        def credentials = (:azure_sas_token, raw"example%of%credentials")
    end
end
```
To access AWS S3, you must provide a valid access key ID and a secret access key. Within the `integration` submodule, you need to provide the following information:

| Field | Description |
| --- | --- |
| `region` | S3 bucket region. |
| `provider` | Cloud storage provider, in this case `"s3"`. |
| `credentials` | Access keys as strings, along with the token names `:access_key_id` and `:secret_access_key`. |
```rel
module config
    def path = "s3://my-s3-bucket/myfile.csv"

    module integration
        def region = "us-east-1"
        def provider = "s3"
        def credentials = (:access_key_id, raw"my_access_key_id")
        def credentials = (:secret_access_key, raw"my_secret_access_key")
    end
end
```
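The private-bucket pattern works the same way as the public one once the credentials are in place. As a hedged sketch, again assuming the standard library's `load_csv` relation (the bucket name and keys here are placeholders):

```rel
// Sketch: reading a private CSV file from S3 using access keys.
module config
    def path = "s3://my-s3-bucket/myfile.csv"

    module integration
        def region = "us-east-1"
        def provider = "s3"
        def credentials = (:access_key_id, raw"my_access_key_id")
        def credentials = (:secret_access_key, raw"my_secret_access_key")
    end
end

// The same config module drives the import: load_csv reads the file
// using the credentials supplied in the integration submodule.
def csv = load_csv[config]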
Now that you know how to access a cloud storage provider, you can check the Data Import and Export guides to learn how to interact with data. See CSV Import and CSV Export for CSV data. If you are interested in JSON, see the JSON Import and JSON Export guides.