Export Results to an AWS S3 Bucket

You can set up an export of analytical and list results to an AWS S3 bucket by performing the overall steps shown below.

Note

  • Partners invited to a clean room must have their export destination connections (grouped under "Destinations") approved by clean room owners. Contact your Customer Success representative to facilitate the approval.

  • If your AWS S3 Bucket is set up with SSE KMS, share the KMS key ARN with your LiveRamp contact.

  • For S3 exports originating from a TEE/HCC clean room, notify your LiveRamp contact, as LiveRamp needs to add your S3 bucket to the firewall allowlist. If you joined the project mid-configuration and are unsure whether your clean room is TEE/HCC and needs this step, ask your LiveRamp contact.

  • Configuring exports outside of a clean room (i.e., at the organization level) is still supported, but will be deprecated. Setting up clean room question exports within a clean room is recommended.

Overall Steps

Perform the following overall steps to set up an export of analytical and list results to an AWS S3 bucket:

Note

The steps vary depending on the type of clean room in the collaboration: Hybrid and TEE clean rooms require one set of steps, while all other clean room types that employ "native" or intra-cloud patterns (such as Snowflake, Databricks, BigQuery, and AWS clean rooms) require another.

For information on performing these steps, see the sections below.

Enable the Clean Room for Exports

Before setting up an export, the clean room owner must enable export for the selected source clean room:

  1. From the LiveRamp Clean Room navigation pane, select Clean Rooms > Clean Rooms (or click Go to Clean Rooms from the Clean Rooms tile).

  2. In the row for the clean room you would like to export from, click the More Options menu (the three dots), and then select Edit.

    CR-Export_to_GCS-edit_button.png
  3. From the Configuration step, click Next Step.

    LCR-Export_Analytical_Results_to_BigQuery-Next_Step_button.png
  4. From the Parameters step, adjust any data control parameters as needed and then slide the Enable Export toggle to the right.

    CR-Export_to_GCS-edit_clean_room_screen.png
  5. Click Next Step.

  6. Verify that your data control parameters are correct and then click Save.

    CR-Export_to_GCS-edit_clean_room_screen_2.png

Add the Credentials

To set up an export to a cloud location, the clean room owner must first add either their own credentials or those of their partner.

The credentials required for exporting to AWS vary depending on the type of clean room in the collaboration:

  • For Hybrid or TEE clean rooms: Use AWS S3 Credentials.

  • For all other clean room types that employ "native" or intra-cloud patterns (such as Snowflake, Databricks, BigQuery, and AWS clean rooms): Use AWS IAM User credentials.

Note

If you're unsure about the clean room type, contact your LiveRamp representative.

Decision Tree for AWS S3 Exports:

CR-Export_to_AWS-decision_tree.png

Add AWS S3 Credentials

For Hybrid or TEE clean rooms, add AWS S3 credentials:

  1. From the LiveRamp Clean Room navigation pane, select Data Management > Credentials.

  2. Click Add Credential.

    add_credential.png
  3. Enter a descriptive name for the credential.

  4. For the Credentials Type, select "AWS S3 Credential".

    CR-Export_to_AWS-AWS_S3_credential.png
  5. In the S3 Bucket Name field, enter your S3 bucket name. This is the portion of the S3 path before the first slash, without the "s3://" prefix. For example, for the S3 path "s3://example-bucket/uploads/daily/{yyyy-MM-dd}/full", you would enter "example-bucket".

  6. In the AWS Region field, enter the region where your S3 bucket is provisioned (such as "us-east-1").

  7. Click Save Credential.

  8. From the LiveRamp Clean Room navigation pane, select Data Management > Credentials.

  9. In the row for your AWS S3 credential, select View Source from the Actions dropdown.

    image idm3400
  10. From the Credential Details screen, copy the Role ARN.

    CR-Export_to_AWS-Role_ARN.png
  11. Apply the policy below to your S3 bucket after modifying it as follows to grant LiveRamp's IAM role access:

    • Paste the Role ARN into the policy where it states "ENTER YOUR ROLE_ARN" between the double quotes.

    • Enter your S3 bucket name into both lines of the policy that say "[ENTER YOUR S3 BUCKET NAME]", replacing the entire placeholder, including the brackets.

      Note

      This is the portion of the S3 path prior to the first slash and without the "s3://".

    {
      "Version": "2012-10-17",
      "Statement": [
        {
          "Sid": "PutAndList",
          "Effect": "Allow",
          "Principal": {
            "AWS": "ENTER YOUR ROLE_ARN"
          },
          "Action": [
            "s3:ListBucket",
            "s3:GetBucketLocation",
            "s3:GetObject",
            "s3:GetObjectVersion",
            "s3:PutObject",
            "s3:PutObjectAcl",
            "s3:DeleteObjectVersion",
            "s3:DeleteObject"
          ],
          "Resource": [
            "arn:aws:s3:::[ENTER YOUR S3 BUCKET NAME]",
            "arn:aws:s3:::[ENTER YOUR S3 BUCKET NAME]/*"
          ]
        }
      ]
    }
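The placeholder substitution above can also be scripted. The following Python sketch uses hypothetical example values for the Role ARN and S3 path; it derives the bucket name (the portion before the first slash, without "s3://") and renders the finished policy:

```python
import json

# Hypothetical example values -- substitute your own Role ARN and S3 path.
ROLE_ARN = "arn:aws:iam::123456789012:role/liveramp-export-role"
S3_PATH = "s3://example-bucket/uploads/daily"

# The bucket name is the portion of the path before the first slash,
# without the "s3://" prefix.
bucket = S3_PATH.removeprefix("s3://").split("/", 1)[0]

policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "PutAndList",
            "Effect": "Allow",
            "Principal": {"AWS": ROLE_ARN},
            "Action": [
                "s3:ListBucket",
                "s3:GetBucketLocation",
                "s3:GetObject",
                "s3:GetObjectVersion",
                "s3:PutObject",
                "s3:PutObjectAcl",
                "s3:DeleteObjectVersion",
                "s3:DeleteObject",
            ],
            "Resource": [
                f"arn:aws:s3:::{bucket}",
                f"arn:aws:s3:::{bucket}/*",
            ],
        }
    ],
}

print(json.dumps(policy, indent=2))
```

The printed JSON can then be pasted into the bucket's policy editor in the AWS console.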

Add AWS IAM User Credentials

For all clean room types that employ "native" or intra-cloud patterns (such as Snowflake, Databricks, BigQuery, and AWS clean rooms), add AWS IAM User credentials:

  1. From the LiveRamp Clean Room navigation pane, select Data Management > Credentials.

  2. Click Add Credential.

    add_credential.png
  3. Enter a descriptive name for the credential.

  4. For the Credentials Type, select "AWS IAM User Credentials".

    CR-_Export__to__AWS-_AWS__IAM__credential.png
  5. Enter the following parameters:

    CR-_Export__to__AWS-_AWS__IAM__credential_parameters.png
    • AWS Access Key ID

    • AWS Secret Access Key

    • User ARN

    • AWS Region

  6. Click Save Credential.

  7. Verify that your credentials have been added to LiveRamp Clean Room.
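Before saving, a quick format check can catch copy-paste mistakes in the credential fields. This Python sketch is illustrative only: the function name and example values are hypothetical, and the format rules (a 20-character access key ID beginning with "AKIA" for long-term IAM user keys, and the standard IAM user ARN pattern) are general AWS conventions, not LiveRamp requirements:

```python
import re

def looks_like_iam_user_credentials(access_key_id: str, user_arn: str) -> bool:
    """Rough format check of the credential fields before saving.

    Long-term IAM user access key IDs are 20 characters and typically
    begin with "AKIA"; user ARNs follow the pattern
    arn:aws:iam::<12-digit-account-id>:user/<name>.
    """
    key_ok = re.fullmatch(r"AKIA[0-9A-Z]{16}", access_key_id) is not None
    arn_ok = re.fullmatch(r"arn:aws:iam::\d{12}:user/.+", user_arn) is not None
    return key_ok and arn_ok

# Hypothetical example values (the access key ID is AWS's documentation sample):
print(looks_like_iam_user_credentials(
    "AKIAIOSFODNN7EXAMPLE",
    "arn:aws:iam::123456789012:user/liveramp-export",
))
```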

Add an Export Destination Connection

The destination account type required for exporting to AWS varies depending on the type of clean room in the collaboration:

  • For Hybrid or TEE clean rooms: Set up an S3 IAM Role destination account.

  • For all other clean room types that employ "native" or intra-cloud patterns (such as Snowflake, Databricks, BigQuery, and AWS clean rooms): Set up an S3 export destination account.

Add an S3 IAM Role Destination Account

For Hybrid or TEE clean rooms, add an S3 IAM Role export destination account:

  1. From the LiveRamp Clean Room navigation pane, select Destinations & Integrations > Destinations.

  2. Click Create Destination Account.

    image idm2060
  3. Select S3 IAM Role Export.

    CR-Export_to_AWS-S3_IAM_Role_Export.png
  4. Enter a name and select the AWS S3 credential created in the "Add the Credentials" section above.

    CR-Export_to_AWS-choose_credential.png

    Note

    Leave the Public Key field blank.

  5. Click Add new account.

  6. Confirm that the new export has been added to your list of S3 IAM export destination accounts.

    CR-Export_to_AWS-confirm_export.png

Note

The status of the destination connection will initially be "Pending", but you can still export data. Once the first successful export has been processed, the status changes to "Complete".

Add an S3 Destination Account

For all clean room types that employ "native" or intra-cloud patterns (such as Snowflake, Databricks, BigQuery, and AWS clean rooms), set up an S3 export destination account:

  1. From the LiveRamp Clean Room navigation pane, select Destinations & Integrations > Destinations.

  2. Click Create Destination Account.

    image idm2060
  3. Select S3 Export.

    CR-Export_to_AWS-S3_Export.png
  4. Enter a name and select the AWS IAM credential created in the "Add the Credentials" section above.

    CR-Export_to_AWS-choose_IAM_credential.png
  5. Click Add new account.

  6. Confirm that the new export has been added to your list of S3 export destination accounts.

Set Up a Data Export Within the Clean Room

To set up a data export within the clean room:

  1. From the LiveRamp Clean Room navigation pane, select Clean Rooms > Clean Rooms.

  2. From the tile for the desired clean room, click Enter.

  3. From the Clean Room navigation pane, select Destinations. The Destinations screen shows all destination connections provisioned to the clean room.

  4. Check the check box for the desired destination connection and then click Provision (AWS S3 example shown).

    CR-Export_to_AWS-_provision_Activation_Channels.png
  5. Verify that your destination connection has been added (S3 IAM example shown).

    CR-Export_to_AWS-_verify_Activation_Channels.png

Create a New Export

To create a new export:

  1. From the Clean Room navigation pane, select Exports.

    CR-Export_to_AWS-Export_menu_selection.png
  2. Click + New to open the wizard to create a new export.

    CR-Export_to_AWS-New_button.png
  3. Select the question that you want to export outputs for and then click Next.

    CR-Export_to_AWS-select_question.png
  4. Check the radio button for the specific export destination account you want to send run outputs to and then click Next.

    CR-Export_to_AWS-select_export_partner.png
  5. Enter the export path to save the results to and then click Finish:

    CR-Export_to_AWS-define_custom_folder.png
    • For S3 exports (native-pattern clean rooms), enter the full s3://… path.

    • For S3 IAM Role exports (TEE or Hybrid clean rooms), enter the folder name.

    Note

    Provide only the bucket name for the analytical or user list outputs. Do not include the "s3://" prefix before the bucket name.

  6. Verify that the job has been created. Exports are added to the page. You may view the details of an export by clicking on the name.

    CR-Export_to_AWS-details_panel-exports.png
    CR-Export_to_AWS-details_panel-details.png
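The path rules in step 5 above can be summarized in a small check. This is a sketch only; `validate_export_path` and the clean-room-type labels are hypothetical names, not part of the LiveRamp product:

```python
def validate_export_path(path: str, clean_room_type: str) -> bool:
    """Check an export path against the rules for the clean room type.

    clean_room_type: "native" (S3 export) or "tee_hybrid" (S3 IAM Role export).
    """
    if clean_room_type == "native":
        # Native-pattern clean rooms take the full s3://... path.
        return path.startswith("s3://")
    # TEE/Hybrid clean rooms take only a folder name -- no URI scheme.
    return "://" not in path

print(validate_export_path("s3://example-bucket/exports", "native"))
print(validate_export_path("exports/daily", "tee_hybrid"))
```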

Note

  • Exports can be paused, which will stop them from sending data upon the completion of each run.

  • Exports cannot be edited or deleted. Changes should be made by pausing the export and creating a new export.

Export Details

When a question runs, the results are written to the defined S3 bucket. Each row has an associated "Run ID" column, and a separate metadata file is created alongside the data.

S3 IAM Role Export (Hybrid or TEE) File Structure

  • s3://<data-source-location>/<custom-prefix>

    • /date=<yyyy-mm-dd>

      • /cleanroom=<clean-room-id>

        • /question=<question-id>

          • /run=<run-id>

            • /data ← For Analytical Questions, this contains 1 .csv file. For List Questions, this contains 1+ .parquet files with List data.

              Note

              List files do not contain column headers

            • metadata.json ← This contains each column header name, data type, and whether it is encrypted. If encryption is used, the DEK (data encryption key) and KEK (key encryption key) are also provided
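Assuming hypothetical IDs, the layout above implies data prefixes like the one assembled in this Python sketch (the bucket, prefix, and ID values are examples, not real values):

```python
from datetime import date

# Hypothetical values -- the real ones come from your bucket and question runs.
bucket_prefix = "s3://example-bucket/exports"   # <data-source-location>/<custom-prefix>
clean_room_id = "CR-123"
question_id = "Q-456"
run_id = "RUN-789"
run_date = date(2024, 5, 1)

# Assemble the /data prefix for one run, following the layout above.
data_prefix = (
    f"{bucket_prefix}/date={run_date:%Y-%m-%d}"
    f"/cleanroom={clean_room_id}/question={question_id}/run={run_id}/data"
)
print(data_prefix)
```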

S3 Export (non-Hybrid, non-TEE) File Structure

  • s3://<data-source-location>/<custom-prefix>

    • /yyyy-MM-dd ← Run Date

      • /runID ← RunID available in the Clean Room and via API ( /cleanroom-questions/{cleanroomQuestionId}/cleanroom-question-runs )

        • /data ← For Analytical Questions, this contains 1 .csv file. For List Questions, this contains 1+ .csv files with List data.

          Note

          List files do not contain column headers

          • meta.csv ← Information about the Question (Title, Description, Category) and Question Run (Name)

          • headers.csv ← Column headers for the /data files
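When consuming these exports programmatically, the run date and Run ID can be recovered from an object key under this layout. A minimal Python sketch, using a hypothetical key:

```python
import re

# Hypothetical object key following the non-TEE layout above:
# <custom-prefix>/<yyyy-MM-dd>/<runID>/data/<file>
key = "example-prefix/2024-05-01/RUN-789/data/part-0000.csv"

# Capture the run date and the run ID segments that precede /data/.
m = re.match(r".*?/(\d{4}-\d{2}-\d{2})/([^/]+)/data/", key)
if m:
    run_date, run_id = m.groups()
    print(run_date, run_id)
```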