Amazon S3 for data import Data Source

An Amazon S3 for data import Data Source pulls JSON, YAML, XML, and .tfstate files from an Amazon S3 bucket, recursing into directories, and loads the files into Styra DAS. It can apply Rego transformations or filtering to the data before it is loaded into Styra DAS, and it authenticates using an IAM access key and secret access key stored as a secret in Styra DAS.
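Conceptually, selecting which objects to load comes down to filtering a recursive bucket listing by file type. The following sketch illustrates this; the helper names and suffix list are assumptions for illustration, not part of the Styra DAS implementation.

```python
# Hypothetical sketch of filtering S3 object keys by file type.
# The suffixes mirror the formats named above (JSON, YAML, XML, .tfstate);
# nothing here is taken from the actual Styra DAS code.
SUPPORTED_SUFFIXES = (".json", ".yaml", ".yml", ".xml", ".tfstate")

def is_supported(key: str) -> bool:
    """Return True if an S3 object key looks like a loadable file."""
    return key.lower().endswith(SUPPORTED_SUFFIXES)

def loadable_keys(keys):
    """Filter a recursive bucket listing down to loadable files,
    preserving the listing order."""
    return [k for k in keys if is_supported(k)]
```

For example, `loadable_keys(["a/b.json", "img.png", "state.tfstate"])` keeps only `a/b.json` and `state.tfstate`.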

Creating or Configuring the Data Source through the Styra DAS UI

Create or configure the Data Source through the Styra DAS UI.

  1. Log in to the Styra DAS UI.
  2. Select the System to which you want to add the Data Source.
  3. Click the kebab icon (three dots ⋮) to the right of the System and select Add Data Source. The Add Data Source dialog box appears.
  4. Select Amazon S3 for JSON object import.
  5. In Path type a new or existing path separated by /. For example, datasourcetypes.
  6. In Data Source name (required) type the name for the Data Source.
  7. (Optional) Type in a Description.
  8. In AWS region (required) select your AWS region from the drop-down selection.
  9. In Bucket Name (and Path) (required) type the bucket name and optionally a path within the bucket. For example, amazon-s3-bucket-testing. For more information on how to set up an AWS user and S3 bucket for secure Styra DAS to Amazon S3 access, see the Amazon S3 Bucket Access page.
    • If only one file is returned from Amazon S3, the result contains the content of that file. For example, if the bucket name and path is tests3/test.json, the result is {"foo": "bar"}.
    • If multiple files are returned from Amazon S3, the result has additional layers with the full folder structure and file names to avoid collisions. For example, if the bucket name and path is tests3/data, the result is {"data": {"file.json": {"foo": "bar"}}}.
  10. In Endpoint override type a gateway endpoint. For more information, see Amazon S3 Endpoints.
  11. In Refresh interval type a refresh interval, which is the amount of time between polls. The default unit is seconds (s).
  12. In Access Keys for IAM Users type the following access key credentials.
    • In Access Key ID (required) type the access key ID. For more information, see AWS IAM User Access Keys.
    • In Secret Access Key (required) type the Styra DAS secret you are using for an Amazon S3 bucket within your own AWS account.
  13. (Optional) Click the arrow to expand the Advanced field.
  14. In Data transform specify a policy and write a query to apply Rego transformations to the data before it is persisted. For example, select Custom and fill in the following fields:
    • In Policy type an existing policy separated by /. For example, transform/transform.rego.
    • In Rego query type a path to the Rego rule to evaluate. For example, data.transform.query.
  15. Leave the Enable on-premises data source agent switch off.
  16. Preview the Data Source in the right pane. If the data is over 1 MB, the Preview will display an error.
  17. Click Add.
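The single-file versus multiple-file behavior described in step 9 can be sketched as follows. This is a conceptual model only; the function name, the JSON-only parsing, and the use of object keys relative to the configured path are assumptions, not Styra DAS code.

```python
import json

def load_result(objects):
    """Conceptual model of the result assembled from fetched S3 objects.

    `objects` maps object keys (relative to the configured path) to raw
    JSON text. With a single file, the result is that file's content;
    with several files, each file's content is nested under its folder
    and file names so that keys from different files cannot collide.
    """
    if len(objects) == 1:
        # Single file: the result is just that file's content.
        [(_, raw)] = objects.items()
        return json.loads(raw)
    result = {}
    for key, raw in objects.items():
        node = result
        *folders, filename = key.split("/")
        for folder in folders:
            node = node.setdefault(folder, {})
        node[filename] = json.loads(raw)
    return result
```

For example, `load_result({"test.json": '{"foo": "bar"}'})` yields `{"foo": "bar"}`, while two files under `data/` yield a nested structure keyed by folder and file name.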

The following shows example output that appears after the Data Source is created in Styra DAS.

```json
{
  "data": {
    "s3-test.json": {
      "foo1": "bar1"
    },
    "s3-test.yaml": {
      "foo3": "bar3"
    },
    "s3-test.yml": {
      "foo2": "bar2"
    }
  }
}
```