Export Results to an AWS S3 Bucket

You can set up an export of analytical and list results to an AWS S3 bucket for the following clean room types:

  • Hybrid

  • Confidential Computing (TEE/HCC)

  • Snowflake

Exporting results to an AWS S3 bucket involves the overall steps shown below.

Note

  • Partners invited to a clean room must have their export destination connections (grouped under "Destinations") approved by clean room owners. Contact your Customer Success representative to facilitate the approval.

  • If your AWS S3 bucket is set up with SSE-KMS, share the KMS key ARN with your LiveRamp contact (a lookup sketch appears after these notes). For information about S3's server-side encryption key management service, see this AWS article.

  • Make sure that the bucket path you use for the export destination connection is distinct from any bucket paths you use for existing AWS data connections in LiveRamp Clean Room. For example, to use the same bucket for both exports and data connections, make sure to use a distinct folder in that bucket for exports and a distinct folder for each data connection.

  • For S3 exports originating from a Confidential Computing (TEE/HCC) clean room, notify your LiveRamp contact because LiveRamp needs to add your S3 bucket to the firewall allowlist. If you joined the project mid-configuration and are unsure if your clean room is Confidential Computing and needs this step, you can ask.

  • Configuring exports outside of a clean room (such as at the organization level) is still supported, but will be deprecated. Setting up clean room question exports within a clean room is recommended.
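
If you're unsure which KMS key your bucket uses, a minimal sketch along the following lines (Python with boto3; the bucket name "example-bucket" is a placeholder) retrieves the key to share with your LiveRamp contact:

    import boto3

    s3 = boto3.client("s3")

    # Read the bucket's default server-side encryption configuration.
    # boto3 raises a ClientError if no default encryption is configured.
    config = s3.get_bucket_encryption(Bucket="example-bucket")

    for rule in config["ServerSideEncryptionConfiguration"]["Rules"]:
        default = rule["ApplyServerSideEncryptionByDefault"]
        if default["SSEAlgorithm"] == "aws:kms":
            # The key ARN (or key ID) configured for SSE-KMS.
            print(default.get("KMSMasterKeyID"))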

Overall Steps

Perform the following overall steps to set up an export of analytical and user list results to an AWS S3 bucket:

  1. Enable the clean room for exports.

  2. Add the credentials.

  3. Add an export destination connection.

  4. For Snowflake clean rooms, configure the question to share results.

  5. Set up a data export within the clean room.

  6. Create a new export.

Note

The steps vary depending on the type of clean room in the collaboration (Hybrid and Confidential Computing clean rooms, or Snowflake clean rooms that employ "native" or intra-cloud patterns).

For information on performing these steps, see the sections below.

Enable the Clean Room for Exports

Before setting up an export, the clean room owner must enable exports for the selected source clean room:

  1. From the LiveRamp Clean Room navigation pane, select Clean Rooms → Clean Rooms (or click Go to Clean Rooms from the Clean Rooms tile).

  2. In the row for the clean room you would like to export from, click the More Options menu (the three dots), and then select Edit.

    CR-Export_to_GCS-edit_button.png
  3. From the Configuration step, click Next Step.

    LCR-Export_Analytical_Results_to_BigQuery-Next_Step_button.png
  4. From the Parameters step, adjust any data control parameters as needed and then slide the Enable Export toggle to the right.

    CR-Export_to_GCS-edit_clean_room_screen.png
  5. Click Next Step.

  6. Verify that your data control parameters are correct and then click Save.

    CR-Export_to_GCS-edit_clean_room_screen_2.png

Add the Credentials

To set up an export to a cloud location, the clean room owner must first add their own credentials or their partner's.

The credentials required for exporting to AWS vary depending on the type of clean room in the collaboration:

  • For Hybrid or Confidential Computing clean rooms: Use AWS S3 Credentials.

  • For Snowflake clean rooms: Use IAM User Role credentials.

Note

If you're unsure about the clean room type, contact your LiveRamp representative.

Decision Tree for AWS S3 Exports:

CR-Export_to_AWS-decision_tree.png

Add AWS S3 Credentials

For Hybrid or Confidential Computing clean rooms, add AWS S3 credentials:

  1. From the LiveRamp Clean Room navigation pane, select Data Management → Credentials.

  2. Click Add Credential.

    add_credential.png
  3. Enter a descriptive name for the credential.

  4. For the Credentials Type, select "AWS S3 Credential".

    CR-Export_to_AWS-AWS_S3_credential.png
  5. In the S3 Bucket Name field, enter your S3 bucket name. This is the portion of the S3 path before the first slash, without the "s3://" prefix. For example, for the S3 path "s3://example-bucket/uploads/daily/date=yyyy-MM-dd", you would enter "example-bucket".

  6. In the AWS Region field, enter the region where your S3 bucket is provisioned (such as "us-east-1").

  7. Click Save Credential.

  8. From the LiveRamp Clean Room navigation pane, select Data Management → Credentials.

  9. In the row for your AWS S3 credential, select View Source from the Actions dropdown.

  10. From the Credential Details screen, copy the Role ARN.

    CR-Export_to_AWS-Role_ARN.png
  11. To grant LiveRamp's IAM role access, modify the policy below and then apply it to your S3 bucket (a scripted alternative appears after the policy):

    • Paste the Role ARN into the policy where it states "ENTER YOUR ROLE_ARN" between the double quotes.

    • Replace "[ENTER YOUR S3 BUCKET NAME]" (including the brackets) with your S3 bucket name in both Resource lines of the policy.

      Note

      This is the portion of the S3 path prior to the first slash and without the "s3://".

    {
       "Version": "2012-10-17",
       "Statement": [
          {
             "Sid": "PutAndList",
             "Effect": "Allow",
             "Principal": {
                "AWS": "ENTER YOUR ROLE_ARN"
             },
             "Action": [
                "s3:ListBucket",
                "s3:GetBucketLocation",
                "s3:GetObject",
                "s3:GetObjectVersion",
                "s3:PutObject",
                "s3:PutObjectAcl",
                "s3:DeleteObjectVersion",
                "s3:DeleteObject"
             ],
             "Resource": [
                "arn:aws:s3:::[ENTER YOUR S3 BUCKET NAME]",
                "arn:aws:s3:::[ENTER YOUR S3 BUCKET NAME]/*"
             ]
          }
       ]
    }
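
If you prefer to apply the policy programmatically rather than through the S3 console, a minimal sketch along these lines (Python with boto3; the bucket name and role ARN are placeholders for your own values) fills in the placeholders and applies it:

    import json
    import boto3

    BUCKET = "example-bucket"  # your bucket name, without "s3://"
    ROLE_ARN = "arn:aws:iam::123456789012:role/ExampleRole"  # placeholder; paste the Role ARN copied above

    policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Sid": "PutAndList",
            "Effect": "Allow",
            "Principal": {"AWS": ROLE_ARN},
            "Action": [
                "s3:ListBucket", "s3:GetBucketLocation", "s3:GetObject",
                "s3:GetObjectVersion", "s3:PutObject", "s3:PutObjectAcl",
                "s3:DeleteObjectVersion", "s3:DeleteObject",
            ],
            "Resource": [
                f"arn:aws:s3:::{BUCKET}",
                f"arn:aws:s3:::{BUCKET}/*",
            ],
        }],
    }

    # Note: put_bucket_policy replaces any existing bucket policy, so
    # merge this statement manually if your bucket already has one.
    boto3.client("s3").put_bucket_policy(Bucket=BUCKET, Policy=json.dumps(policy))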

Add AWS IAM User Credentials

For Snowflake clean rooms, add IAM User Role credentials:

  1. From the LiveRamp Clean Room navigation pane, select Data Management → Credentials.

  2. Click Add Credential.

    add_credential.png
  3. Enter a descriptive name for the credential.

  4. For the Credentials Type, select "AWS IAM User Credentials".

    CR-_Export__to__AWS-_AWS__IAM__credential.png
  5. Enter the following parameters (a sketch for double-checking the ARN and region values appears after these steps):

    CR-_Export__to__AWS-_AWS__IAM__credential_parameters.png
    • AWS Access Key ID

    • AWS Secret Access Key

    • User ARN

    • AWS Region

  6. Click Save Credential.

  7. Verify that your credentials have been added to LiveRamp Clean Room.
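
If you want to double-check the User ARN and AWS Region values before saving, a minimal sketch like the following (Python with boto3, run under the IAM user's access keys; "example-bucket" is a placeholder) prints both:

    import boto3

    # The ARN of the identity whose access keys are in use.
    print(boto3.client("sts").get_caller_identity()["Arn"])

    # The region where the export bucket is provisioned; S3 reports
    # None for us-east-1.
    location = boto3.client("s3").get_bucket_location(Bucket="example-bucket")
    print(location["LocationConstraint"] or "us-east-1")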

Add an Export Destination Connection

The destination account type required for exporting to AWS varies depending on the type of clean room in the collaboration:

  • For Hybrid or Confidential Computing clean rooms: Set up an S3 IAM Role destination account.

  • For Snowflake clean rooms: Set up an S3 export destination account.

Add an S3 IAM Role Destination Account

For Hybrid or Confidential Computing clean rooms, add an S3 IAM Role export destination account:

  1. From the LiveRamp Clean Room navigation pane, select Destinations & Integrations → Destinations.

  2. Click Create Destination Connection.

  3. Select S3 IAM Role Export.

    CR-Export_to_AWS-S3_IAM_Role_Export.png
  4. Enter a name and select the AWS S3 credential created in the "Add the Credentials" section above.

    CR-Export_to_AWS-choose_credential.png

    Note

    Leave the Public Key field blank.

  5. Click Add new account.

  6. Confirm that the new export has been added to your list of S3 IAM export destination accounts.

    CR-Export_to_AWS-confirm_export.png

Note

The status of the destination connection is initially "Configured", but you can still export data. Once the first successful export has been processed, the status changes to "Complete".

Add an S3 Destination Account

For Snowflake clean rooms, set up an S3 export destination account:

  1. From the LiveRamp Clean Room navigation pane, select Destinations & Integrations → Destinations.

  2. Click Create Destination Connection.

  3. Select S3 Export.

    CR-Export_to_AWS-S3_Export.png
  4. Enter a name and select the AWS IAM credential created in the "Add the Credentials" section above.

    CR-Export_to_AWS-choose_IAM_credential.png
  5. Click Add new account.

  6. Confirm that the new export has been added to your list of S3 export destination accounts.

Configure the Question to Share Results

For exports to AWS from a Snowflake clean room, you need to configure the question to share the results from your Snowflake account to LiveRamp:

Note

This step does not have to be performed if you’re exporting a question from a Hybrid or Confidential Computing clean room.

  1. From the LiveRamp Clean Room navigation pane, select Clean Rooms → Clean Rooms.

  2. From the tile for the desired clean room, click Enter.

  3. From the Clean Room navigation pane, select Questions.

  4. In the row for the question whose results you want to export, click the More Options menu and then select Share Question Results.

    LCR-Export_AWS-Share_Question_Results_menu_selection.png
  5. From the dialog that appears, make sure that the check box for LiveRamp is checked and then click Share.

    LCR-Export_AWS-Share_button.png

Set Up a Data Export Within the Clean Room

To set up a data export within the clean room:

  1. From the LiveRamp Clean Room navigation pane, select Clean Rooms → Clean Rooms.

  2. From the tile for the desired clean room, click Enter.

  3. From the Clean Room navigation pane, select Destinations. The Destinations screen shows all destination connections provisioned to the clean room.

  4. Check the check box for the desired destination connection and then click Provision (AWS S3 example shown).

    CR-Export_to_AWS-_provision_Activation_Channels.png
  5. Verify that your destination connection has been added (S3 IAM example shown).

    CR-Export_to_AWS-_verify_Activation_Channels.png

Create a New Export

After you've provisioned the destination connection to the clean room, create a new export:

  1. From the Clean Room navigation pane, select Exports.

    CR-Export_to_AWS-Export_menu_selection.png
  2. Click + New to open the wizard to create a new export.

    CR-Export_to_AWS-New_button.png
  3. Select the question that you want to export outputs for and then click Next.

    CR-Export_to_AWS-select_question.png
  4. Select the specific export destination account you want to send run outputs to, and then click Next.

    CR-Export_to_AWS-select_export_partner.png
  5. Enter the export path to save the results to, and then click Finish:

    CR-Export_to_AWS-define_custom_folder.png
    • For S3 exports (Snowflake clean rooms), enter the full s3://… path.

    • For S3 IAM Role exports (Confidential Computing or Hybrid clean rooms), enter the folder name.

    Note

    Provide only the bucket or folder name for the analytical or user list outputs. Do not include the "s3://" prefix before the bucket name.

  6. Verify that the export has been created. Exports are added to the page, and you can view the details of an export by clicking its name.

    CR-Export_to_AWS-details_panel-exports.png
    CR-Export_to_AWS-details_panel-details.png

Note

  • Exports can be paused, which will stop them from sending data upon the completion of each run.

  • Exports cannot be edited or deleted. Changes should be made by pausing the export and creating a new export.

Export Details

When a question runs, the results are written to the defined S3 bucket. Each row has an associated "Run ID" column. A metadata file is also created, as shown in the file structures below.

S3 IAM Role Export (Hybrid or Confidential Computing) File Structure

  • s3://<data-source-location>/<custom-prefix>

    • /date=<yyyy-mm-dd>

      • /cleanroom=<clean-room-id>

        • /question=<clean-room-question-id>

          • /run=<run-id>

            • /data ← For analytical questions, this contains 1 .csv file. For user list questions, this contains 1+ .parquet files with list data.

              Note

              List files do not contain column headers.

            • metadata.json ← This contains each column header name, data type, and whether it is encrypted. If encryption is used, the dek and kek are also provided.
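
As an illustration of how a downstream job might read this layout, here is a minimal sketch (Python with boto3; the bucket name, prefix, and IDs are placeholders, and the exact schema of metadata.json may vary):

    import json
    import boto3

    s3 = boto3.client("s3")

    # Placeholders: substitute your own bucket, prefix, and IDs.
    bucket = "example-bucket"
    run_prefix = "my-prefix/date=2025-01-01/cleanroom=cr-1/question=q-1/run=r-1/"

    # metadata.json describes each column: name, data type, and whether
    # it is encrypted (with dek/kek details when encryption is used).
    meta = s3.get_object(Bucket=bucket, Key=run_prefix + "metadata.json")
    print(json.loads(meta["Body"].read()))

    # The /data folder holds the result files: one .csv for analytical
    # questions, or one or more headerless .parquet files for user lists.
    listing = s3.list_objects_v2(Bucket=bucket, Prefix=run_prefix + "data/")
    for obj in listing.get("Contents", []):
        print(obj["Key"], obj["Size"])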

S3 Export (Snowflake) File Structure

  • s3://<data-source-location>/<custom-prefix>

    • /yyyy-MM-dd ← Run date

      • /runID ← Run ID, available in the clean room UI and via the API (/cleanroom-questions/{cleanroomQuestionId}/cleanroom-question-runs)

        • /data ← For analytical questions, this contains 1 .csv file. For user list questions, this contains 1+ .csv files with list data.

          Note

          List files do not contain column headers.

        • meta.csv ← Information about the question (title, description, category) and question run (name)

        • headers.csv ← Column headers for the /data files
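
Because the /data files are headerless, a consumer needs to join them with headers.csv. Here is a minimal sketch (Python with boto3; the bucket name, prefix, and run ID are placeholders, and it assumes the column names occupy the first row of headers.csv):

    import csv
    import io
    import boto3

    s3 = boto3.client("s3")

    # Placeholders: substitute your own bucket, prefix, and run ID.
    bucket = "example-bucket"
    run_prefix = "my-prefix/2025-01-01/run-1/"

    def read_csv(key):
        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
        return list(csv.reader(io.StringIO(body.decode("utf-8"))))

    # headers.csv carries the column names for the headerless /data files.
    headers = read_csv(run_prefix + "headers.csv")[0]

    listing = s3.list_objects_v2(Bucket=bucket, Prefix=run_prefix + "data/")
    for obj in listing.get("Contents", []):
        for row in read_csv(obj["Key"]):
            print(dict(zip(headers, row)))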