Configure a Google Cloud Storage Data Connection (LiveRamp-Hosted)

LiveRamp Clean Room’s application layer enables companies to securely connect distributed datasets with full control and flexibility while protecting the privacy of consumers and the rights of data owners.

To configure a LiveRamp-hosted Google Cloud Storage (GCS) data connection, see the instructions below.

Note

You can connect GCS to LiveRamp Clean Room from your own GCS account instead of using a LiveRamp-hosted GCS account. For more information, see "Configure a Google Cloud Storage Data Connection (Customer-Hosted)".

Overall Steps

Perform the following overall steps to configure a LiveRamp-hosted GCS data connection:

  1. Generate a Google Cloud Storage database in LiveRamp Clean Room.

  2. Add the credentials.

  3. Create the data connection.

  4. Map the fields.

For information on performing these steps, see the sections below.

Guidelines

Review the following guidelines before starting the setup process:

  • LiveRamp Clean Room supports CSV and Parquet files, as well as multi-part files. All files should have a file extension. All CSV files must have a header in the first row. Headers must not contain spaces or special characters and must not exceed 50 characters; an underscore can be used in place of a space (a header check sketch follows this list).

  • The folder where the data files are dropped can optionally include a date macro. The path in the data-in configuration should include the date macro (for example, "abc/xyx/{yyyy-MM-dd}"), and the actual data files should be placed under the matching date folder. The date macro can appear anywhere in the path. The date must be within seven days of job creation (a path resolution sketch follows this list).

  • LiveRamp encourages the use of partition columns for optimal question run performance.
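
As an illustration of the header rules above, here is a minimal Python sketch that checks a CSV file's first row before upload. The helper and the file name "events.csv" are hypothetical examples, not part of LiveRamp's tooling:

    import csv
    import re

    # Per the guidelines above: letters, digits, and underscores only,
    # and at most 50 characters per header.
    HEADER_RE = re.compile(r"^[A-Za-z0-9_]{1,50}$")

    def invalid_headers(path, delimiter=","):
        """Return any first-row headers that break the naming rules."""
        with open(path, newline="") as f:
            headers = next(csv.reader(f, delimiter=delimiter))
        return [h for h in headers if not HEADER_RE.match(h)]

    bad = invalid_headers("events.csv")
    if bad:
        print("Rename these columns before uploading:", bad)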
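
One way to expand the date macro before dropping files is sketched below. The path is the placeholder from the guideline above, and the expansion logic is an assumption about the folder layout, not LiveRamp code:

    from datetime import date

    # Expand {yyyy-MM-dd} to today's date. The dated folder must be
    # within seven days of job creation.
    data_in_path = "abc/xyx/{yyyy-MM-dd}"
    resolved = data_in_path.replace("{yyyy-MM-dd}", date.today().isoformat())
    print(resolved)  # e.g. abc/xyx/2024-03-28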

Generate a Google Cloud Storage Database in LiveRamp Clean Room

To generate a GCS database in LiveRamp Clean Room:

  1. From the navigation pane, select Data Management → Data Source Locations.

  2. In the row for Habu Google Cloud Storage, click Generate Location.

    data_source_locations.png

    Note

    These credentials may also be generated when creating a new data connection.

Add the Credentials

To add credentials:

  1. From the LiveRamp Clean Room navigation pane, select Data Management → Credentials.

  2. In the row for the Habu Google Service Account Credential Source, select "Activate" from the Actions menu.

    activate_gcs_creds.png
  3. Review the credentials information and then click ACTIVATE CREDENTIALS.

    The next screen displays the Google Project ID and the Credential JSON.

  4. Copy and store the credentials in a secure location.

Use the credentials to authorize and send files to the LiveRamp-hosted GCS bucket generated in the previous procedure.
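
For example, a file can be uploaded with the google-cloud-storage Python client library. In the minimal sketch below, the project ID, bucket name, object path, and file names are placeholders; substitute the Google Project ID, Credential JSON, and bucket location generated above:

    from google.cloud import storage
    from google.oauth2 import service_account

    # Authenticate with the Credential JSON saved in the previous step
    # (the file name here is a placeholder).
    creds = service_account.Credentials.from_service_account_file(
        "habu-credential.json"
    )
    client = storage.Client(project="your-google-project-id", credentials=creds)

    # Upload a data file to the LiveRamp-hosted bucket (placeholder names).
    bucket = client.bucket("habu-client-org-example-bucket")
    blob = bucket.blob("purchase_events/2024-03-28/incremental/events.csv")
    blob.upload_from_filename("events.csv")
    print("Uploaded to gs://{}/{}".format(bucket.name, blob.name))

The same upload can also be done from the command line after activating the service account with "gcloud auth activate-service-account --key-file=<credential JSON>".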

Create the Data Connection

After you've added the credentials to LiveRamp Clean Room, create the data connection:

  1. From the LiveRamp Clean Room navigation pane, select Data Management → Data Connections.

  2. From the Data Connections page, click New Data Connection.

    data_cxn_new.png
  3. From the New Data Connection screen, select "Habu Google Cloud Storage".

    Screenshot 2024-03-28 at 12.29.07.png
  4. If you've already generated credentials, they will automatically populate. Otherwise, you can generate or regenerate credentials from this page.

  5. Configure the data connection:

    Screenshot 2024-03-28 at 12.31.38.png
    • Name: Enter a name of your choice.

    • Category: Enter a category of your choice.

    • Dataset Type: Select Generic.

    • Sample File Path: Leave this field blank. The data location generated above will be used to locate the file.

    • File Format: Select CSV.

      Note

      • All files must have a header in the first row. Headers should not have any spaces or special characters and should not exceed 50 characters. An underscore can be used in place of a space.

      • If you are uploading a CSV file, avoid double quotes in your data (such as "First Name" or "Country").

    • Field Delimiter: If you are uploading CSV files, select the delimiter to use (comma, semicolon, pipe, or tab). A CSV writing sketch appears at the end of this section.

    • Data Location: This field automatically populates with the GCS bucket location generated in the "Generate a Google Cloud Storage Database in LiveRamp Clean Room" section above, including the date macro and refresh type. For example: "gs://habu-client-org-123ab456-7d89-10e1-a234-567b891c0123/purchase_events/{yyyy-MM-dd}/incremental". When uploading files, replace the date macro with the actual date, without the braces.

  6. Review the data connection details and click Save Data Connection.

    Note

    All configured data connections can be seen on the Data Connections page.

  7. Upload your data files to your specified location.

When a connection is initially configured, it shows "Verifying Access" as the configuration status.

habu_goog_waiting.png

You will receive file processing notifications via email.
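
As a companion to the file format settings in step 5, here is a minimal Python sketch that writes a pipe-delimited CSV with a header row and no double quotes in the data. The column names and rows are invented for illustration:

    import csv

    rows = [
        {"first_name": "Ada", "country": "UK"},
        {"first_name": "Linus", "country": "Finland"},
    ]

    with open("events.csv", "w", newline="") as f:
        writer = csv.DictWriter(
            f,
            fieldnames=["first_name", "country"],
            delimiter="|",           # must match the Field Delimiter setting
            quoting=csv.QUOTE_NONE,  # error out rather than emit quotes
        )
        writer.writeheader()
        writer.writerows(rows)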

Map the Fields

Once the connection is confirmed and the status has changed to "Mapping Required", map the table's fields and add metadata:

  1. From the row for the newly created data connection, click the More Options menu (the three dots) and then click Edit Mapping.

    The Map Fields screen opens and the file column names auto-populate.

    data_cxn_mapping_mapfields.png
  2. For any columns that you do not want to be queryable, slide the Include toggle to the left.

  3. If needed, update any column labels.

    Note

    Ignore the field delimiter fields because the delimiter was defined in a previous step.

  4. Click Next.

    The Add Metadata screen opens.

    data_cxn_mapping_mapmetadata.png
  5. For any column that contains PII data, slide the PII toggle to the right.

  6. Select the data type for each column.

  7. For columns that you want to partition, slide the Allow Partitions toggle to the right.

  8. If a column contains PII, slide the User Identifiers toggle to the right and then select the user identifier that defines the PII data.

  9. Click Save.

Your data connection configuration is now complete and the status changes to "Completed".