Configure a Google Cloud Storage Data Connection (Customer-Hosted)

If you have data in Google Cloud Storage (GCS) and want to be able to use that data in questions in LiveRamp Clean Room, you can create a Google Cloud Storage data connection.

Note

You can connect GCS to LiveRamp Clean Room using a LiveRamp-hosted GCS instance instead of using your own. For more information, see "Configure a Google Cloud Storage Data Connection (LiveRamp-Hosted)".

A customer-hosted Google Cloud Storage data connection can be used in the following clean room types:

  • Hybrid

  • Confidential Computing

  • BigQuery

After you’ve created the data connection and Clean Room has validated the connection by connecting to the data in your cloud account, you will then need to map the fields before the data connection is ready to use. This is where you specify which fields will be queryable across clean rooms, which fields contain identifiers to be used in matching, and any columns by which you wish to partition the dataset for questions.

After fields have been mapped, you’re ready to provision the resulting dataset to your desired clean rooms. Within each clean room, you’ll be able to set dataset analysis rules, exclude or include columns, filter for specific values, and set permission levels.

To configure a customer-hosted Google Cloud Storage (GCS) data connection, see the instructions below.

Overall Steps

Perform the following overall steps in Google Cloud Platform to configure a customer-hosted GCS data connection:

  1. Create a Google service account.

  2. Create a Google service account key.

  3. Grant the service account permissions to access bucket objects.

  4. Assign the custom role to the Cloud Storage service account.

  5. Capture the data location.

Once the above steps have been performed in Google Cloud Platform, perform the following overall steps in LiveRamp Clean Room:

  1. Add the credentials in LiveRamp Clean Room.

  2. Create the data connection.

  3. Map the fields.

For information on performing these steps, see the sections below.

Perform Steps in Google Cloud Platform

Perform the steps in the sections below in Google Cloud Platform to configure a customer-hosted GCS data connection.

Create a Google Service Account

To create a Google service account in GCP:

  1. From GCP's main menu, select IAM & Admin > Service Accounts.

  2. Click CREATE SERVICE ACCOUNT. Save the service account email because you will need it in later steps.

  3. Enter a name for the service account.

  4. Click Create & Continue.

  5. Configure roles and additional user access as needed.

  6. Click DONE.
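
If you prefer to script this step, the sketch below shows a roughly equivalent call using Google's IAM API client for Python (the google-api-python-client package). The project ID, account ID, and display name are placeholders, and the snippet assumes Application Default Credentials with permission to create service accounts.

    from googleapiclient import discovery

    # Build an IAM API client (uses Application Default Credentials).
    iam = discovery.build("iam", "v1")

    # Placeholder project and account IDs; replace with your own values.
    account = iam.projects().serviceAccounts().create(
        name="projects/my-project-id",
        body={
            "accountId": "clean-room-sa",
            "serviceAccount": {"displayName": "LiveRamp Clean Room access"},
        },
    ).execute()

    # Save this email; you'll need it when granting bucket access.
    print(account["email"])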

Create a Google Service Account Key

After you've created a Google service account, create a Google service account key:

  1. From GCP's main menu, select IAM & Admin > Service Accounts.

  2. Select the check box for the service account you created in the previous procedure.

  3. From the More Options menu for the service account's row, select Manage keys.

  4. Click ADD KEY.

  5. For the key type, select JSON and then click CREATE.

  6. The private key file is downloaded to your Downloads folder. Save this file for use in the "Add the Credentials in LiveRamp Clean Room" section below.

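As a quick local sanity check, you can confirm the downloaded key parses correctly using the google-auth library for Python. The file name below is a placeholder; use the actual name of your downloaded key file.

    from google.oauth2 import service_account

    # Placeholder file name; point this at the JSON key you downloaded.
    creds = service_account.Credentials.from_service_account_file(
        "my-project-id-1a2b3c4d5e6f.json"
    )

    # If the key parses, these print the service account email and project.
    print(creds.service_account_email)
    print(creds.project_id)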

Grant the Service Account Permissions to Access Bucket Objects

To grant the service account permissions to access bucket objects:

Note

If you haven’t already created a bucket, create a bucket by following these Google instructions.

  1. From GCP's main menu, select IAM & Admin > Roles.

  2. Select CREATE ROLE.

  3. Enter a title, description, and ID for the custom role.

  4. Select Add permissions.

  5. Enter "Storage Admin" in the filter.

  6. Add the following permissions:

    • storage.buckets.get

    • storage.objects.get

    • storage.objects.list

  7. Click CREATE.
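
If you'd rather create the custom role programmatically, the sketch below uses the same IAM API client for Python; the role ID, title, and description are placeholders.

    from googleapiclient import discovery

    iam = discovery.build("iam", "v1")

    # Placeholder role ID and metadata; adjust to your naming conventions.
    role = iam.projects().roles().create(
        parent="projects/my-project-id",
        body={
            "roleId": "cleanRoomBucketReader",
            "role": {
                "title": "Clean Room Bucket Reader",
                "description": "Read-only bucket access for LiveRamp Clean Room",
                "includedPermissions": [
                    "storage.buckets.get",
                    "storage.objects.get",
                    "storage.objects.list",
                ],
                "stage": "GA",
            },
        },
    ).execute()

    print(role["name"])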

Assign the Custom Role to the Cloud Storage Service Account

To assign the custom role to the cloud storage service account:

  1. From the Google Cloud Platform console, search for "Cloud Storage".

  2. From the navigation menu, select Buckets.

  3. Click the bucket name for the bucket you want to configure for access.

  4. Select the Permissions tab.

  5. Click Grant Access.

  6. Select SHOW INFO PANEL in the upper-right corner. The information panel for the bucket displays.

  7. From the information panel, click ADD PRINCIPAL.

  8. In the Add members field, add the service account.

  9. From the Select a role dropdown, select Custom and then select the custom Cloud Storage role you created in the previous procedure.

  10. Click SAVE.

  11. Confirm the custom role and service account are now associated with the bucket.

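The same binding can also be added with the google-cloud-storage library for Python. The bucket name, project ID, role ID, and service account email below are placeholders.

    from google.cloud import storage

    client = storage.Client()
    bucket = client.bucket("my-clean-room-bucket")  # placeholder bucket name

    # Request policy version 3, which supports role bindings with conditions.
    policy = bucket.get_iam_policy(requested_policy_version=3)

    # Bind the custom role to the service account (placeholder identifiers).
    policy.bindings.append({
        "role": "projects/my-project-id/roles/cleanRoomBucketReader",
        "members": {"serviceAccount:clean-room-sa@my-project-id.iam.gserviceaccount.com"},
    })

    bucket.set_iam_policy(policy)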

Capture the Data Location

During the process of creating the data connection, you will need to enter the data location in the form of the GCS bucket file path.

  1. From the Google Cloud Platform console, search for "Cloud Storage".

  2. From the navigation menu, select Buckets.

  3. Select the Objects tab.

  4. From the More Options menu (the three dots) in the row for the relevant folder or object, select Copy gsutil URL.

  5. Save the bucket file path for use in the “Create the Data Connection” section below.
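
Before moving on, you can confirm that the service account can actually list and read objects at that path. A minimal check in Python, authenticating with the key file from earlier (bucket name, prefix, and file name are placeholders):

    from google.cloud import storage

    # Authenticate as the service account using the downloaded key file.
    client = storage.Client.from_service_account_json(
        "my-project-id-1a2b3c4d5e6f.json"
    )

    # Placeholder bucket and prefix taken from the gsutil URL, e.g.
    # gs://my-clean-room-bucket/purchase_events/
    for blob in client.list_blobs("my-clean-room-bucket", prefix="purchase_events/"):
        print(blob.name, blob.size)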

Perform Steps in LiveRamp Clean Room

Once the above steps have been performed in Google Cloud Platform, perform the overall steps in the sections below in LiveRamp Clean Room.

Note

If your cloud security limits access to only approved IP addresses, talk to your LiveRamp representative before creating the data connection to coordinate any necessary allowlisting of LiveRamp IP addresses.

Add the Credentials in LiveRamp Clean Room

To add credentials:

  1. From the LiveRamp Clean Room navigation pane, select Data Management > Credentials.

  2. Click Add Credential.

  3. Enter a descriptive name for the credential.

  4. For the Credentials Type, select "Google Service Account".

  5. For the Project ID, enter the project ID.

  6. Enter the Credential JSON you stored in the "Create a Google Service Account Key" procedure above.

  7. Click Save Credential.


Create the Data Connection

To create the data connection:

  1. From the LiveRamp Clean Room navigation pane, select Data Management > Data Connections.

  2. From the Data Connections page, click New Data Connection.

  3. From the New Data Connection screen, select "Google Cloud Storage (with SA)".

  4. Select the credentials created in the previous procedure from the list.

  5. Complete the following fields in the Set up Data Connection section:

    • Category: Enter a category of your choice.

    • Dataset Type: Select Generic.

    • File Format: Select CSV, Parquet, or Delta.

      Note

      • All files must have a header in the first row. Headers should not have any spaces or special characters and should not exceed 50 characters. An underscore can be used in place of a space. (A header-check sketch appears at the end of this section.)

      • If you are uploading a CSV file, avoid double quotes in your data (such as "First Name" or "Country").

    • Quote Character: If you're uploading CSV files, enter the quote character you'll be using (if any).

    • Field Delimiter: If you're uploading CSV files, select the delimiter to use (comma, semicolon, pipe, or tab).

  6. Complete the following tasks and fields in the Data Location and Schema section:

    • To use partitioning on the dataset associated with the data connection, slide the Uses Partitioning toggle to the right.

      Note

      If the data connection uses partitioning, the dataset can be divided into subsets so that data processing occurs only on relevant data during question runs, which results in faster processing times. When using partitioning, you must enter a data schema reference file below.

    • Data Location: Enter the GCS bucket location captured in the “Capture the Data Location” section above. For example, "gs://clean-room-client-org-123ab456-7d89-10e1-a234-567b891c0123/purchase_events/date=yyyy-MM-dd/".

      Note

      • The data location must start with "gs://" and end with a forward slash ("/").

      • Make sure that the bucket path you use for the data connection is distinct from any bucket paths you use for existing GCS export destination connections in LiveRamp Clean Room. For example, to use the same bucket for both exports and data connections, make sure to use a distinct folder in that bucket for exports and a distinct folder for each data connection.

    • Sample File Path: If you enabled partitioning above, enter the location of a data schema reference file.

      Note

      • The data schema reference file name must start with "gs://" and end with a valid file extension (such as ".csv").

      • The data schema reference file must be hosted in a static location and must have been uploaded within the last seven days.

  7. Review the data connection details and click Save Data Connection.

    All configured data connections can be seen on the Data Connections page.

  8. If you haven't already, upload your data files to your specified location.

When a connection is initially configured, it will show "Verifying Access" as the configuration status. Once the connection is confirmed and the status has changed to "Mapping Required", map the table's fields.

You will receive file processing notifications via email.
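
Because header problems are a common cause of failed file processing, you may want to validate CSV headers locally before uploading. The sketch below enforces the header rules from the note above (the file name is a placeholder):

    import csv
    import re

    # Placeholder file name; point this at the CSV you plan to upload.
    with open("purchase_events.csv", newline="") as f:
        header = next(csv.reader(f))

    # Headers: no spaces or special characters, at most 50 characters;
    # underscores are allowed in place of spaces.
    pattern = re.compile(r"^[A-Za-z0-9_]{1,50}$")
    for column in header:
        if not pattern.match(column):
            print(f"Invalid column name: {column!r}")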

Map the Fields

To map the fields:

Note

Before mapping the fields, we recommend confirming any expectations your partners might have for field types for any specific fields that will be used in questions.

  1. From the row for the newly created data connection, click the More Options menu (the three dots) and then click Edit Mapping.

    The Map Fields screen opens, and the file column names auto-populate.

  2. For any columns that you do not want to be queryable, slide the Include toggle to the left.

  3. If needed, update any column labels.

    Note

    Ignore the field delimiter fields because the delimiter was defined in a previous step.

  4. Click Next.

    The Add Metadata screen opens.

  5. For any column that contains PII data, slide the PII toggle to the right.

  6. Select the data type for each column.

  7. If a column contains PII, slide the User Identifiers toggle to the right and then select the user identifier that defines the PII data.

  8. Click Save.

Your data connection configuration is now complete and the status changes to "Completed".

You can now provision the resulting dataset to your desired Hybrid, Confidential Computing, or BigQuery clean room.