
Export Results to GCS

You can set up an export of analytics and list results to Google Cloud Storage (GCS) by performing the overall steps shown below.

Note

  • Partners invited to a clean room must have their export destination connections (grouped under "Destinations") approved by clean room owners. Contact your Customer Success representative to facilitate the approval.

  • The IAM role from LiveRamp Clean Room needs to have write/delete and read access on the customer bucket/folder.

  • Configuring exports outside of a clean room (i.e., at the organization level) is still supported, but will be deprecated. Setting up clean room question exports within a clean room is recommended.

Overall Steps

Perform the following overall steps in Google Cloud Platform to set up an export of analytics and list results to GCS:

  1. Create a Google service account.

  2. Create a Google service account key.

  3. Grant the service account permissions to access bucket objects.

  4. Assign the custom role to the Cloud Storage service account.

Once the above steps have been performed in Google Cloud Platform, perform the following overall steps in LiveRamp Clean Room:

  1. Enable the clean room for exports.

  2. Add the credentials.

  3. Add an export destination connection.

  4. Set up a data export within the clean room.

  5. Create a new export.

For information on performing these steps and for export details, see the sections below.

Perform Steps in Google Cloud Platform

Perform the steps in the sections below in Google Cloud Platform to set up an export of analytics and list results to GCS.

Create a Google Service Account

To create a Google service account in Google Cloud Platform:

  1. From the Google Cloud Platform main menu, select IAM & Admin → Service Accounts.

  2. Click CREATE SERVICE ACCOUNT. Save the service account email because you will need it in later steps.

  3. Enter a name for the service account.

  4. Click Create & Continue.

  5. Configure roles and additional user access as needed.

  6. Click DONE.

Create a Google Service Account Key

After you've created a Google service account, create a Google service account key:

  1. From the Google Cloud Platform main menu, select IAM & Admin → Service Accounts.

  2. Select the check box for the service account you created in the previous procedure.

  3. From the More Options menu for the service account's row, select Manage keys.

  4. Click ADD KEY and then click Create new key.

  5. For the key type, select JSON and then click CREATE.


The private key file will be downloaded to your Downloads folder. Save this file for use in the “Add the Credentials” section below.

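The downloaded key file is a JSON document along the lines of the following sketch (all values shown here are placeholders, not real credentials):

```json
{
  "type": "service_account",
  "project_id": "your-project-id",
  "private_key_id": "0123456789abcdef",
  "private_key": "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n",
  "client_email": "your-sa@your-project-id.iam.gserviceaccount.com",
  "client_id": "123456789012345678901",
  "auth_uri": "https://accounts.google.com/o/oauth2/auth",
  "token_uri": "https://oauth2.googleapis.com/token"
}
```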

Grant the Service Account Permissions to Access Bucket Objects

To grant the service account permissions to access bucket objects:

  1. From the Google Cloud Platform console, log in as a Project Editor.

  2. From the home dashboard, select IAM & Admin → Roles.

  3. Select CREATE ROLE.

  4. Enter a title, description, and ID for the custom role.

  5. Select Add permissions.

  6. Enter "Storage Admin" in the filter.

  7. Add the following permissions:

    • storage.managedFolders.create

    • storage.managedFolders.delete

    • storage.managedFolders.get

    • storage.managedFolders.list

    • storage.multipartUploads.abort

    • storage.multipartUploads.create

    • storage.multipartUploads.list

    • storage.multipartUploads.listParts

    • storage.objects.create

    • storage.objects.delete

    • storage.objects.get

    • storage.objects.getIamPolicy

    • storage.objects.list

    • storage.objects.overrideUnlockedRetention

    • storage.objects.restore

    • storage.objects.setIamPolicy

    • storage.objects.setRetention

    • storage.objects.update

  8. Click CREATE.

Note

If you don’t want to set up a custom role with the above permissions, you can instead assign the Storage Object Admin role to the service account. Instructions on assigning a role to the service account can be found below.
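If you prefer to script the role creation rather than use the console, the same permissions can be captured in a role definition file and applied with `gcloud iam roles create` (a sketch; the role ID, file name, and descriptive strings below are placeholders):

```yaml
# role.yaml — custom role with the bucket-object permissions listed above
title: "Clean Room GCS Export"
description: "Read/write/delete access for LiveRamp Clean Room exports"
stage: "GA"
includedPermissions:
- storage.managedFolders.create
- storage.managedFolders.delete
- storage.managedFolders.get
- storage.managedFolders.list
- storage.multipartUploads.abort
- storage.multipartUploads.create
- storage.multipartUploads.list
- storage.multipartUploads.listParts
- storage.objects.create
- storage.objects.delete
- storage.objects.get
- storage.objects.getIamPolicy
- storage.objects.list
- storage.objects.overrideUnlockedRetention
- storage.objects.restore
- storage.objects.setIamPolicy
- storage.objects.setRetention
- storage.objects.update
```

It can then be created with, for example, `gcloud iam roles create cleanRoomGcsExport --project=your-project-id --file=role.yaml`.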

Assign the Custom Role to the Cloud Storage Service Account

To assign the custom role to the cloud storage service account:

  1. From the Google Cloud Platform console, log in as a Project Editor.

  2. Search for "Cloud Storage".

  3. From the home dashboard, select Storage → Browser.

  4. Click the bucket name for the bucket you want to configure for access.

  5. Select SHOW INFO PANEL in the upper-right corner. The information panel for the bucket displays.

  6. From the information panel, click ADD PRINCIPAL.

  7. In the Add members field, add the service account.

  8. From the Select a role dropdown, select Custom → role (where "role" is the custom Cloud Storage role you created in the previous procedure).

  9. Click SAVE.

  10. Confirm the custom role and service account are now associated with the bucket.


Perform Steps in LiveRamp Clean Room

Once the above steps have been performed in Google Cloud Platform, perform the overall steps in the sections below in LiveRamp Clean Room.

Enable the Clean Room for Exports

Before setting up an export, the clean room owner must enable export for the selected source clean room:

  1. From the LiveRamp Clean Room navigation pane, select Clean Rooms → Clean Rooms (or click Go to Clean Rooms from the Clean Rooms tile).

  2. In the row for the clean room you would like to export from, click the More Options menu (the three dots), and then select Edit.

  3. From the Configuration step, click Next Step.

  4. From the Parameters step, adjust any data control parameters as needed and then slide the Enable Export toggle to the right.

  5. Click Next Step.

  6. Verify that your data control parameters are correct and then click Save.


Add the Credentials

To set up an export to a cloud location, the clean room owner must first add either their own credentials or those of their partner:

  1. From the LiveRamp Clean Room navigation pane, select Data Management → Credentials.

  2. Click Add Credential.

  3. Enter a descriptive name for the credential.

  4. For the Credentials Type, select "Google Service Account".

  5. For the Project ID, enter your Google Cloud project ID.

  6. Enter your Credential JSON (the contents of the key file you downloaded when creating the service account key). The credential JSON is hidden by default.

  7. Click Save Credential.

  8. Verify that your credentials have been added to LiveRamp Clean Room:

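Before pasting the Credential JSON, it can help to sanity-check that the file you are about to use really is a service account key. A minimal sketch (the function name and the required-field subset are illustrative, not part of LiveRamp's product):

```python
import json

# A subset of the fields every Google service account key file contains,
# used here as a quick sanity check before pasting the JSON.
REQUIRED_FIELDS = {"type", "project_id", "private_key", "client_email"}

def looks_like_service_account_key(raw_json: str) -> bool:
    """Return True if raw_json parses and resembles a service account key."""
    try:
        key = json.loads(raw_json)
    except json.JSONDecodeError:
        return False
    return key.get("type") == "service_account" and REQUIRED_FIELDS <= key.keys()

sample = '{"type": "service_account", "project_id": "p", "private_key": "k", "client_email": "e"}'
print(looks_like_service_account_key(sample))  # True
```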

Add an Export Destination Connection

To add an export destination connection:

  1. From the LiveRamp Clean Room navigation pane, select Destinations & Integrations → Destinations.

  2. Click Create Destination Account.

  3. Select GCS Export.

  4. Enter a name and select the GCS Credential created in the "Add the Credentials" section above.


    Note

    Leave the Public Key field blank.

  5. Click Add new account.

  6. Confirm that the new destination account has been added to your list of GCS export destination accounts.


Note

The status of the destination connection will initially be "Pending", but you can still export data. Once the first successful export has been processed, the status changes to "Complete".

Set Up a Data Export Within the Clean Room

To set up a data export within the clean room:

  1. From the LiveRamp Clean Room navigation pane, select Clean Rooms → Clean Rooms.

  2. From the tile for the desired clean room, click Enter.

  3. From the Clean Room navigation pane, select Destinations. The Destinations screen shows all destination connections provisioned to the clean room.

  4. Select the check box for the desired destination connection and then click Provision.

  5. Verify that your destination connection has been added.


Create a New Export

To create a new export:

  1. From the LiveRamp Clean Room navigation pane, select Clean Rooms → Clean Rooms.

  2. From the tile for the desired clean room, click Enter.

  3. From the Clean Room navigation pane, select Exports.

  4. Click + New to open the wizard to create a new export.

  5. Select the question that you want to export outputs for and then click Next.

  6. Select the radio button for the specific GCS destination account you want to send run outputs to.

  7. Enter the GCS export path to save the results to and then click Finish.


    Note

    Provide only the bucket name for the analytical or user list outputs. Do not include the gs:// prefix before the bucket name.

  8. Verify that the job has been created. The export is added to the Exports page. You can view the details of an export by clicking its name.


Note

  • Exports can be paused, which stops them from sending data when each run completes.

  • Exports cannot be edited or deleted. To make changes, pause the existing export and create a new export.
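Because the export wizard expects the destination without the gs:// prefix (per the note in step 7 above), a small helper can normalize paths copied from other tools before you paste them. A sketch (the function name is illustrative):

```python
def normalize_gcs_export_path(path: str) -> str:
    """Normalize a GCS path for the Clean Room export wizard.

    The wizard expects the bucket name (and optional folder) without
    the gs:// prefix, so strip the prefix and any stray slashes.
    """
    if path.startswith("gs://"):
        path = path[len("gs://"):]
    return path.strip("/")

print(normalize_gcs_export_path("gs://my-results-bucket/exports/"))
# my-results-bucket/exports
```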

Export Details

When a question runs, the results are written to the defined bucket, and each row includes an associated "Run ID" column. A second metadata table is also created; users can join on Run ID to retrieve run metadata (Run Name, Runtime Parameters, etc.).
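For example, once both outputs have been loaded from the bucket, the join on Run ID might look like the following sketch (the row contents and all column names other than "Run ID" are illustrative):

```python
# Illustrative rows: question results (one "Run ID" per row) and the
# companion metadata table written alongside them.
results = [
    {"Run ID": "run-001", "overlap_count": 1200},
    {"Run ID": "run-001", "overlap_count": 950},
    {"Run ID": "run-002", "overlap_count": 1415},
]
metadata = [
    {"Run ID": "run-001", "Run Name": "weekly_overlap"},
    {"Run ID": "run-002", "Run Name": "monthly_overlap"},
]

# Index metadata by Run ID, then enrich each result row with its run metadata.
meta_by_run = {m["Run ID"]: m for m in metadata}
enriched = [{**row, **meta_by_run[row["Run ID"]]} for row in results]
print(enriched[0]["Run Name"])  # weekly_overlap
```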

GCS Export File Structure

  • gs://<data-source-location>/<custom-prefix>

    • /date=<yyyy-mm-dd>

      • /cleanroom=<clean-room-id>

        • /question=<question-id>

          • /run=<run-id>

            • /data

              For Analytical Questions, this contains one .csv file. For List Questions, this contains one or more .parquet files with List data.

              Note

              List files do not contain column headers.

            • metadata.json

              This contains each column header name, data type, and whether it is encrypted. If encryption is used, the DEK (data encryption key) and KEK (key encryption key) are also provided.
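The structure above can be expressed as a path builder. A sketch, with placeholder bucket, prefix, and IDs:

```python
from datetime import date

def export_prefix(bucket, custom_prefix, run_date, clean_room_id, question_id, run_id):
    """Build the GCS prefix under which one run's outputs are written,
    following the export file structure documented above."""
    return (
        f"gs://{bucket}/{custom_prefix}"
        f"/date={run_date.isoformat()}"
        f"/cleanroom={clean_room_id}"
        f"/question={question_id}"
        f"/run={run_id}"
    )

prefix = export_prefix("my-results-bucket", "exports", date(2024, 5, 1),
                       "cr-123", "q-456", "run-789")
# Data files land under f"{prefix}/data/"; run metadata at f"{prefix}/metadata.json".
print(prefix)
```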