Amazon S3 Bundle Data Source

An Amazon S3 Bundle data source is similar to a Bundle Git data source. Instead of automatically reading JSON out of Git, it uploads a bundle stored in an Amazon S3 bucket to Styra DAS and compiles it. The bundle can contain one or more .rego files organized in folders, along with a data.json file of constant data that the Rego policies can reference.
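For illustration, such a bundle can be assembled locally before uploading it to the S3 bucket. The sketch below builds a gzipped tarball in the layout OPA bundles use; the file names (`a.rego`, `data.json`) and package name are example assumptions, not requirements:

```python
import io
import json
import tarfile

# Example bundle contents: one Rego policy file and one constant-data file.
files = {
    "datasourcetypes/a.rego": b"package datasourcetypes.bundle_s3\n\ndefault allow := false\n",
    "data.json": json.dumps({"env": "testing"}).encode(),
}

# Write the bundle as a gzipped tarball, the format OPA bundles use.
with tarfile.open("bundle.tar.gz", "w:gz") as tar:
    for name, content in files.items():
        info = tarfile.TarInfo(name=name)
        info.size = len(content)
        tar.addfile(info, io.BytesIO(content))

# List the archive members to confirm the layout.
with tarfile.open("bundle.tar.gz", "r:gz") as tar:
    print(sorted(tar.getnames()))  # → ['data.json', 'datasourcetypes/a.rego']
```

The resulting tarball can then be uploaded to the bucket, for example with `aws s3 cp bundle.tar.gz s3://<bucket>/<path>`.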

See Amazon S3 Bucket Access for Amazon S3 setup information.

Configure through the Styra DAS UI

This section describes how to configure <das-id> to access a data source stored in Amazon S3 using the Styra DAS UI.

Create a Styra DAS System

Go to <das-id>. To add a new System, click the plus ( ⨁ ) icon next to SYSTEMS on the left side of the navigation panel.

Fill in the following fields:

  • System type (required): Select a System type from the drop-down list. For example, Custom.

  • System name (required): A user-friendly name so that you can distinguish between different systems.

  • Description (optional): More details about this system.

  • Leave the Show errors switch ON to display errors.

  • Click the Add system button.

The Styra DAS System appears under SYSTEMS on the left side of the navigation panel.

Add a Data Source

After you create your system, click the three dots menu next to it and select Add Data Source to start configuring the data source.

Figure 1 - Add Data Source

The Custom System >> Add Data Source dialog appears.

Figure 2 - Add Data Source Window

Complete the following steps in your Custom System >> Add Data Source dialog box.

  1. Type: Click the down arrow and select the data source type. For example, select AWS S3 for bundle import to pull OPA bundles from AWS S3 on a regular interval.

    Figure 3 - Data Source Type

  2. Path: Enter a new or existing path separated by /. For example, am/datasourcetypes.

  3. Data source name (required): Enter a name for the data source. For example, am-amazon-s3.

  4. Description: This field is optional.

  5. AWS region (required): A string representing the AWS region. Select one of the regions from AWS service Endpoints. For example, us-east-1.

  6. Bucket Name (and Path) (required): Enter the bucket name and, optionally, a path within that bucket. For example, aws-s3-bucket-testing. For more information on how to set up an AWS user and an Amazon S3 bucket for secure DAS S3 access, see the Amazon S3 Bucket Access page.

  7. Endpoint override: An alternative S3 endpoint, such as a gateway endpoint. For more information, see Amazon S3 Endpoints.

  8. Refresh interval: Enter the refresh interval, which is the amount of time between polling intervals. The default unit is seconds (s).

  9. Access Keys for IAM Users: Enter the following access key credentials.

    • Access Key ID (required): Enter the access key ID. For more information, see AWS IAM User Access Keys.

    • Secret Access Key (required): This DAS secret is required if you are using an Amazon S3 bucket within your own AWS account.

  10. Click the arrow to expand the Advanced field.

  11. Leave the Enable on-premises data source agent switch OFF.

    Ensure that all the fields are filled in, similar to Figure 4.

    Figure 4 - Completed Data Source Form

  12. Finally, click the Add button to add a data source.
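The UI steps above can also be scripted against the Styra DAS API. The sketch below only assembles the request body; the field names, category value, endpoint path, and all placeholder values are illustrative assumptions and should be checked against your DAS version's API reference:

```python
import json

# Hypothetical values; substitute your own tenant, token, and paths.
DAS_URL = "https://<das-id>.styra.com"
DATASOURCE_PATH = "systems/<system-id>/am/datasourcetypes/am-amazon-s3"

# Assumed request body mirroring the UI form fields (names illustrative).
payload = {
    "category": "aws/bundle",           # assumed category for S3 bundle import
    "region": "us-east-1",              # AWS region
    "bucket": "aws-s3-bucket-testing",  # Bucket Name (and Path)
    "polling_interval": "30s",          # Refresh interval (example value)
    "access_keys": {
        "access_key_id": "<access-key-id>",
        "secret_access_key": "<secret-access-key>",
    },
}

# The request itself would then be sent as, for example:
#   PUT {DAS_URL}/v1/datasources/{DATASOURCE_PATH}
#   Authorization: Bearer <api-token>
print(json.dumps(payload, indent=2))
```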

The following example output appears after the data source is created in DAS.

"_data": {},
"_packages": {
"data.datasourcetypes.bundle_s3": {
"a.rego": {}
"_signatures": null,
"allow": false