Export Results to an AWS S3 Bucket
You can set up an export of analytical and list results to an AWS S3 bucket by performing the overall steps shown below.
Note
Partners invited to a clean room must have their export destination connections (grouped under "Destinations") approved by clean room owners. Contact your Customer Success representative to facilitate the approval.
If your AWS S3 bucket is set up with SSE-KMS, share the KMS key ARN with your LiveRamp contact (a sketch for looking up the key follows these notes).
For S3 exports originating from a TEE/HCC clean room, notify your LiveRamp contact, as LiveRamp needs to add your S3 bucket to the firewall allowlist. If you joined the project mid-configuration and aren't sure whether your clean room is TEE/HCC, ask your LiveRamp contact.
Configuring exports outside of a clean room (i.e., at the organization level) is still supported but will be deprecated. We recommend setting up question exports within a clean room.
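If you're unsure which KMS key your bucket uses, the following sketch (Python with boto3; the bucket name is a placeholder) is one way to look it up:

import boto3

s3 = boto3.client("s3")

# Look up the default encryption configuration on the bucket.
resp = s3.get_bucket_encryption(Bucket="example-bucket")
for rule in resp["ServerSideEncryptionConfiguration"]["Rules"]:
    default = rule["ApplyServerSideEncryptionByDefault"]
    if default["SSEAlgorithm"] == "aws:kms":
        # This value may be a key ID, alias, or full ARN depending on how
        # the bucket was configured; LiveRamp needs the full key ARN.
        print("KMS key:", default.get("KMSMasterKeyID"))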
Overall Steps
Perform the following overall steps to set up an export of analytical and list results to an AWS S3 bucket:
Note
The steps vary depending on the type of clean room in the collaboration: Hybrid and TEE clean rooms follow one set of steps, while all other clean room types that employ "native" or intra-cloud patterns (such as Snowflake, Databricks, BigQuery, and AWS clean rooms) follow another.
For information on performing these steps, see the sections below.
Enable the Clean Room for Exports
Before setting up an export, the clean room owner must enable export for the selected source clean room:
From the LiveRamp Clean Room navigation pane, select Clean Rooms → Clean Rooms (or click Go to Clean Rooms from the Clean Rooms tile).
In the row for the clean room you would like to export from, click the More Options menu (the three dots), and then select Edit.
From the Configuration step, continue to the Parameters step.
From the Parameters step, adjust any data control parameters as needed and then slide the Enable Export toggle to the right.
Continue to the next step.
Verify that your data control parameters are correct and then save your changes.
Add the Credentials
To set up an export to a cloud location, the clean room owner must first add either their own credentials or those of their partner.
The credentials required for exporting to AWS vary depending on the type of clean room in the collaboration:
For Hybrid or TEE clean rooms: Use AWS S3 Credentials.
For all other clean room types that employ "native" or intra-cloud patterns (such as Snowflake, Databricks, BigQuery, and AWS clean rooms): Use IAM User Role credentials.
Note
If you're unsure about the clean room type, contact your LiveRamp representative.
Decision Tree for AWS S3 Exports:
Hybrid or TEE clean room → AWS S3 Credentials → S3 IAM Role Export destination
All other ("native" or intra-cloud) clean room types → AWS IAM User Credentials → S3 Export destination
Add AWS S3 Credentials
For Hybrid or TEE clean rooms, add AWS S3 credentials:
From the LiveRamp Clean Room navigation pane, select Data Management → Credentials.
Click the button to add a new credential.
Enter a descriptive name for the credential.
For the Credentials Type, select "AWS S3 Credential".
In the S3 Bucket Name field, enter your S3 bucket name. This is the portion of the S3 path prior to the first slash, without the "s3://" prefix. For example, for the S3 path "s3://example-bucket/uploads/daily/{yyyy-MM-dd}/full", you would enter "example-bucket".
In the AWS Region field, enter the region where your S3 bucket is provisioned (such as "us-east-1").
Save the credential.
From the LiveRamp Clean Room navigation pane, select Data Management → Credentials.
In the row for your AWS S3 credential, select View Source from the Actions dropdown.
From the Credential Details screen, copy the Role ARN.
Modify the policy shown below and then apply it to your S3 bucket to grant LiveRamp's IAM role access:
Paste the Role ARN into the policy where it states "ENTER YOUR ROLE_ARN" between the double quotes.
Enter your S3 bucket name without the brackets into both lines of the policy where it says "[ENTER YOUR S3 BUCKET NAME]".
Note
This is the portion of the S3 path prior to the first slash and without the "s3://".
{ "Version": "2012-10-17", "Statement": [ { "Sid": "PutAndList", "Effect": "Allow", "Principal": { "AWS": "ENTER YOUR ROLE_ARN" }, "Action": [ "s3:ListBucket", "s3:GetBucketLocation", "s3:GetObject", "s3:GetObjectVersion", "s3:PutObject", "s3:PutObjectAcl", "s3:DeleteObjectVersion", "s3:DeleteObject" ], "Resource": [ "arn:aws:s3:::[ENTER YOUR S3 BUCKET NAME]", "arn:aws:s3:::[ENTER YOUR S3 BUCKET NAME]/*" ] } ] }
Add AWS IAM User Credentials
For all clean room types that employ "native" or intra-cloud patterns (such as Snowflake, Databricks, BigQuery, and AWS clean rooms), add IAM User Role credentials:
From the LiveRamp Clean Room navigation pane, select Data Management → Credentials.
Click the button to add a new credential.
Enter a descriptive name for the credential.
For the Credentials Type, select "AWS IAM User Credentials".
Enter the following parameters:
AWS Access Key ID
AWS Secret Access Key
User ARN
AWS Region
Save the credential.
Verify that your credentials have been added to LiveRamp Clean Room.
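Before entering the key pair, you can optionally sanity-check it outside of LiveRamp Clean Room. A minimal sketch using Python and boto3 (the key values are placeholders):

import boto3

# Placeholders -- substitute the access key pair you plan to enter.
sts = boto3.client(
    "sts",
    aws_access_key_id="AKIA_EXAMPLE_KEY_ID",
    aws_secret_access_key="EXAMPLE_SECRET_ACCESS_KEY",
)

# get_caller_identity returns the ARN of the IAM user the keys belong to;
# it should match the User ARN you enter in the credential form.
print(sts.get_caller_identity()["Arn"])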
Add an Export Destination Connection
The destination account type required for exporting to AWS varies depending on the type of clean room in the collaboration:
For Hybrid or TEE clean rooms: Set up an S3 IAM Role destination account.
For all other clean room types that employ "native" or intra-cloud patterns (such as Snowflake, Databricks, BigQuery, and AWS clean rooms): Set up an S3 export destination account.
Add an S3 IAM Role Destination Account
For Hybrid or TEE clean rooms, add an S3 IAM Role export destination account:
From the LiveRamp Clean Room navigation pane, select Destinations & Integrations → Destinations.
Click the button to add a new destination.
Select S3 IAM Role Export.
Enter a name and select the AWS S3 credential created in the "Add the Credentials" section above.
Note
Leave the Public Key field blank.
Save the destination.
Confirm that the new export destination has been added to your list of S3 IAM export destination accounts.
Note
The status of the destination connection will be "Pending" initially, but you can continue to export data. Once the first successful export has been processed, the status changes to "Complete".
Add an S3 Destination Account
For all clean room types that employ "native" or intra-cloud patterns (such as Snowflake, Databricks, BigQuery, and AWS clean rooms), set up an S3 export destination account:
From the LiveRamp Clean Room navigation pane, select Destinations & Integrations → Destinations.
Click the button to add a new destination.
Select S3 Export.
Enter a name and select the AWS IAM credential created in the "Add the Credentials" section above.
Save the destination.
Confirm that the new export destination has been added to your list of S3 export activation channels.
Set Up a Data Export Within the Clean Room
To set up a data export within the clean room:
From the LiveRamp Clean Room navigation pane, select Clean Rooms → Clean Rooms.
From the tile for the desired clean room, open the clean room.
From the Clean Room navigation pane, select Destinations. The Destinations screen shows all destination connections provisioned to the clean room.
Check the check box for the desired destination connection and then add it to the clean room.
Verify that your destination connection has been added.
Create a New Export
To create a new export:
From the Clean Room navigation pane, select Exports.
Click the button to open the wizard to create a new export.
Select the question that you want to export outputs for.
Check the radio button for the specific export destination account you want to send run outputs to.
Enter the export path to save the results to:
For S3 exports (native-pattern clean rooms), enter the full s3://… path.
For S3 IAM Role exports (TEE or Hybrid clean rooms), enter the folder name.
Note
Provide only the bucket name for the analytical or user list outputs. Do not include the "s3://" prefix before the bucket name.
Verify that the job has been created. Exports are added to the page. You can view the details of an export by clicking its name.
Note
Exports can be paused, which will stop them from sending data upon the completion of each run.
Exports cannot be edited or deleted. Changes should be made by pausing the export and creating a new export.
Export Details
When a question runs, the results are written to the defined S3 bucket. Each row has an associated "Run ID" column, and a separate metadata file is also created.
S3 IAM Role Export (Hybrid or TEE) File Structure
s3://<data-source-location>/<custom-prefix>
/date=<yyyy-mm-dd>
/cleanroom=<clean-room-id>
/question=<question-id>
/run=<run-id>
/data ← For Analytical Questions, this contains 1 .csv file. For List Questions, this contains 1+ .parquet files with List data.
Note
List files do not contain column headers.
metadata.json ← This contains each column header name, data type, and whether it is encrypted. If encryption is used, the DEK (data encryption key) and KEK (key encryption key) are also provided.
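As an illustration, a minimal boto3 sketch for locating a run's output and reading its metadata.json (the bucket, prefix, and IDs are placeholders following the structure above):

import json
import boto3

s3 = boto3.client("s3")
BUCKET = "example-bucket"  # placeholder
PREFIX = "custom-prefix/date=2024-01-01/cleanroom=CR-1/question=Q-1/run=RUN-1/"

# List everything written for this run: the /data files plus metadata.json.
for obj in s3.list_objects_v2(Bucket=BUCKET, Prefix=PREFIX).get("Contents", []):
    print(obj["Key"])

# Read the column metadata (header names, data types, encryption details).
resp = s3.get_object(Bucket=BUCKET, Key=PREFIX + "metadata.json")
print(json.loads(resp["Body"].read()))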
S3 Export (non-Hybrid, non-TEE) File Structure
s3://<data-source-location>/<custom-prefix>
/yyyy-MM-dd ← Run Date
/runID ← RunID available in the Clean Room and via API ( /cleanroom-questions/{cleanroomQuestionId}/cleanroom-question-runs )
/data ← For Analytical Questions, this contains 1 .csv file. For List Questions, this contains 1+ .csv files with List data.
Note
List files do not contain column headers.
meta.csv ← Information about the Question (Title, Description, Category) and Question Run (Name)
headers.csv ← Column headers for the /data files
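Because the /data files have no header row, a common pattern is to read headers.csv first and apply its values as column names. A minimal sketch using pandas (the paths and data file name are placeholders; reading s3:// paths with pandas requires the s3fs package, and headers.csv is assumed here to hold one header name per line):

import pandas as pd

base = "s3://example-bucket/custom-prefix/2024-01-01/RUN-1"  # placeholder path

# headers.csv holds the column headers for the /data files.
headers = pd.read_csv(f"{base}/headers.csv", header=None).iloc[:, 0].tolist()

# Apply those headers when reading a headerless data file.
df = pd.read_csv(f"{base}/data/part-00000.csv", header=None, names=headers)
print(df.head())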