Amazon S3 Data Source

An Amazon S3 data source pulls JSON and YAML files from an Amazon S3 bucket, recursing into its directories, and loads the files into Styra DAS. It can apply Rego transformations or filtering to the data before it is loaded into Styra DAS, and it authenticates with an IAM access key ID and secret access key, where the secret access key is stored as a secret in Styra DAS.
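
As a quick illustration of how loaded data can be consumed, the following sketch shows a Rego policy that reads a value from such a data source. It is only a sketch: it assumes the data source is mounted at the example path am/datasourcetypes used later in this guide and that the bucket returned a single file whose content is {"foo": "bar"}; adjust the package, path, and key names to match your own setup.

package example

# Minimal sketch (assumptions): the S3 data source is mounted at the
# example path am/datasourcetypes and returned a single file whose
# content is {"foo": "bar"}, so the file content appears directly under
# that path (see the note on single-file results later in this guide).
default allow = false

allow {
    data.am.datasourcetypes.foo == "bar"
}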

Configure through the Styra DAS UI

The following section helps you configure <das-id>.styra.com to access data stored in an Amazon S3 bucket using the Styra DAS UI.

Create a Styra DAS System

Go to <das-id>.styra.com. To add a new system, click the ( ⨁ ) plus icon next to SYSTEMS on the left side of the navigation panel.

Fill in the following fields:

  • System type (required): Select a system type from the drop-down list. For example, Custom.

  • System name (required): A user-friendly name so that you can distinguish between the different systems.

  • Description (optional): More details about this system.

  • Leave the Show errors switch ON to display the errors.

  • Click the Add system button.

The Styra DAS system is created under SYSTEMS on the left side of the navigation panel.

Add a Data Source

After you create your system, click the three dots next to it and select Add Data Source to start configuring the data source.

Figure 1 - Add Data Source

The Custom System >> Add Data Source dialog appears.

Figure 2 - Add Data Source Window

Complete the following steps in your Custom System >> Add Data Source dialog box.

  1. Type: Click the down arrow to select the data source type. The initial selection is an editable data source that you fill in with JSON data and publish. For this example, select AWS S3 to pull a JSON object from a specific AWS S3 bucket; the object is refreshed regularly.

    Figure 3 - Data Source Type

  2. Path: Enter a new or existing path, with segments separated by /. For example, am/datasourcetypes.

  3. Data source name (required): Enter a name for the data source. For example, am-aws-s3.

  4. Description: This field is optional.

  5. AWS region (required): A string representing the AWS region. Select one of the regions listed in AWS Service Endpoints. For example, us-east-1.

  6. Bucket Name (and Path) (required): Enter the bucket name, optionally followed by a path within that bucket. For example, aws-s3-bucket-testing. For more information on how to set up an AWS user and S3 bucket for secure DAS S3 access, see the AWS S3 Bucket Access page.

    note
    • If only one file is returned from S3, the result contains the content of that file. For example, if the bucket name and path is tests3/test.json, the result is {"foo": "bar"}.

    • If multiple files are returned from S3, the result has additional layers with the full folder structure and file names to avoid collisions. For example, if the bucket name and path is tests3/data, the result is {"data": {"file.json": {"foo": "bar"}}}.

  7. Endpoint: A gateway endpoint. For more information, see AWS S3 Endpoints.

  8. Refresh interval: Enter a refresh interval, which is the amount of time between successive polls of the bucket, specified with a unit suffix such as s for seconds.

  9. Access Keys for IAM Users: Enter the following access key credentials.

    • Access Key ID (required): Enter the access key ID. For more information, see AWS IAM User Access Keys.

    • Secret Access Key (required): This DAS secret is required if you are using an S3 bucket within your own AWS account.

  10. Click the arrow to expand the Advanced field.

  11. Data transform: Specify a policy and a query that apply Rego transformations to the fetched content before it is persisted as data. For example, select Custom and fill in the following fields (a sketch of such a transform policy appears after the example output at the end of this page):

    • Policy: The path to an existing policy, with segments separated by /. For example, transform/transform.rego.

    • Rego query: Path to the Rego rule to evaluate. For example, data.transform.query.

  12. Leave the Enable on-premises data source agent switch OFF.

    Now, make sure you have filled in all the fields, similar to Figure 4.

    Figure 4 - Completed Data Source Form

  13. Finally, click the Add button to add a data source.

The following example shows the output that appears after the data source is created in DAS.

{
  "data": {
    "s3-test.json": {
      "foo1": "bar1"
    },
    "s3-test.yaml": {
      "foo3": "bar3"
    },
    "s3-test.yml": {
      "foo2": "bar2"
    }
  }
}
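
As a sketch of the Data transform option from step 11, the following hypothetical transform/transform.rego policy (evaluated through the example query data.transform.query) would flatten output like the one above into a single object. It assumes that the fetched documents are presented to the transform policy as input and that keys do not repeat across files; treat it as an illustration rather than a required transform.

package transform

# Merge the key/value pairs of every file under the "data" folder into
# one flat object, dropping the file-name layer. For the example output
# above this yields {"foo1": "bar1", "foo2": "bar2", "foo3": "bar3"}.
# Assumes keys are unique across files; duplicates would cause an
# object-key conflict.
query := {k: v |
    some file, k
    v := input.data[file][k]
}

With such a transform configured, Styra DAS would persist the flattened object instead of the per-file structure shown above.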