Perform RampID Translation in BigQuery

LiveRamp's translation capabilities in BigQuery allow for the translation of a RampID from one partner encoding to another using either maintained or derived RampIDs. This allows you to match persistent pseudonymous identifiers to one another and enables use of the data without sharing the sensitive underlying identifiers.

Note

Specifically, RampID translation enables:

  • Person-based analytics

  • Increased match rates in data collaboration

  • Measurement enablement across device types

These capabilities are available within BigQuery through a LiveRamp solution, which creates a share to your account, opening up a view to query the reference data set from within your own BigQuery environment. See "LiveRamp Embedded Identity in BigQuery" for more information.

Overall Steps

Once you’ve enabled LiveRamp Embedded Identity in BigQuery, performing a translation operation involves the following tasks:

Note

For instructions on enabling LiveRamp Embedded Identity in BigQuery, see “Enabling LiveRamp Embedded Identity in BigQuery”.

  1. You prepare the input and metadata tables to be used for translation.

  2. You share the tables and datasets with LiveRamp.

  3. You call the shared stored procedure to initiate the translation operation, referencing your tables.

  4. LiveRamp processes the input and writes the output to the designated output dataset. Once processing is complete, LiveRamp will email you to confirm completion.

See the sections below for information on performing these tasks.

Authentication

The LiveRamp Identity Service in BigQuery relies on the same authentication service as LiveRamp's AbiliTec and RampID APIs (Identity APIs). If you have credentials to those APIs, you can use your previously assigned credentials.

Note

Coordinate with LiveRamp to enable these API credentials for use with embedded identity.

Authenticating with LiveRamp's GCP service requires a call to LiveRamp's core services on the customer's behalf. Client credentials (the client ID and client secret values) are used to obtain an access token. In this workflow, you pass these credentials in the metadata table described below.

Prepare the Tables for Translation

Translation with the LiveRamp solution requires the preparation and deployment of two tables:

  • A metadata table, indicated in the code examples as <metadata_table_name>.

    Note

    As long as the column names in the input table stay the same, the original metadata table can be reused for multiple operations. You only need to create a new metadata table if you change the column names in the input table.

  • An input table, indicated in the code examples as <input_table_name>.

    Note

    An input table needs to be prepared for each translation operation.

You can create these tables inside BigQuery or import them into your dataset using BigQuery's standard methods. Replace the <...> placeholders with your own values, and be sure that the column names in the input table are referenced correctly in the metadata table.

When creating tables, keep the following guidelines in mind (in addition to the guidelines listed in the sections below):

  • Every column name must be unique in a table.

  • Avoid including columns beyond those required for the translation operation; extra columns slow down processing.

  • The translation operation can process records containing blank fields.

Table Naming Guidelines

When naming tables, follow these guidelines:

  • Table names must use ASCII characters and not contain either spaces or special characters such as !@#$%.

  • Table names can use underscores “_” within the name, but not as the initial character.

  • Consider using the following elements in your table names: type of data or description, a date or timestamp, and an identity designation. For example, the table name Identity_TwoButtonSuitsCampaign_impressions_2022-06-01 contains all three element types.

Metadata Table Columns and Descriptions

The metadata table passes the required credentials, specifies the type of operation, and specifies the column names in the input table to reference for the original RampIDs, the domain to translate to, and the identifier type.

As long as the column names in the input table stay the same, the original metadata table can be reused for multiple operations. You only need to create a new metadata table if you change the column names in the input table.

Metadata column names must match those shown in the table below. The column names are not case sensitive, and should not be enclosed in single or double quotation marks.

  • CLIENT_ID: Enter either an existing CLIENT_ID or a new one provided during implementation.

  • CLIENT_SECRET: Enter the password/secret for the CLIENT_ID.

  • EXECUTION_MODE: Enter "Transcoding".

  • EXECUTION_TYPE: Enter "Transcoding".

  • TARGET_COLUMN: Enter the name of the input table column that contains the RampIDs to be translated.

  • TARGET_DOMAIN_COLUMN: Enter the name of the input table column that contains the target domain for the encoding the RampIDs should be translated to.

  • TARGET_TYPE_COLUMN: Enter the name of the input table column that contains the target identifier type.

Here is an example of BigQuery SQL for creating a metadata table:

-- Example SQL query to create a metadata table for translation

CREATE OR REPLACE TABLE <dataset>.<metadata_table_name> (
CLIENT_ID STRING, 
CLIENT_SECRET STRING, 
EXECUTION_MODE STRING, 
EXECUTION_TYPE STRING, 
TARGET_COLUMN STRING, 
TARGET_DOMAIN_COLUMN STRING, 
TARGET_TYPE_COLUMN STRING
);

If a subsequent run needs a different configuration, update the metadata table before invoking the procedure:

-- Example SQL query to insert a row into the metadata table for translation

INSERT INTO <dataset>.<metadata_table> VALUES ('<client_id>', '<client_secret>', 'TRANSCODING', 'TRANSCODING', 'RAMPID', 'TARGET_DOMAIN', 'TARGET_TYPE');
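
If only a value changes between runs (for example, pointing TARGET_COLUMN at a different input column), you can update the existing row rather than recreating the table. Here is a minimal sketch, assuming the metadata table holds a single configuration row; the column name RAMPID_V2 is a hypothetical example:

-- Example SQL query to update the metadata row for a new run
-- (assumes a single-row metadata table; RAMPID_V2 is a hypothetical column name)
UPDATE <dataset>.<metadata_table>
SET TARGET_COLUMN = 'RAMPID_V2'
WHERE TRUE;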

Input Table Columns and Descriptions

An input table needs to be prepared for each translation operation.

The column names for the input table can be whatever you want to use, as long as the names match the values specified in the metadata table.

  • rampid: The RampID (maintained or derived) to be translated. Sample: XYT999RkQ3MEY1RUYtNUIyMi00QjJGLUFDNjgtQjQ3QUEwMTNEMTA1CgMjVBMkNEMTktRD

  • Target_Domain: The four-character alphanumeric target domain. Sample: T001

    • Enter a partner’s domain when translating from your native encoding to that partner’s domain.

    • Enter your domain when translating from a partner’s encoding to your native encoding.

  • Target_Type: The target identifier type. Currently only "RampID" is supported. Sample: RampID

Here are some examples of BigQuery SQL for creating this table:

-- Example SQL query to create an input table for translation

CREATE OR REPLACE TABLE <dataset>.<input_table_name> (
RAMPID STRING, 
TARGET_DOMAIN STRING, 
TARGET_TYPE STRING
);

-- Alternatively, create a view over an existing table

CREATE OR REPLACE VIEW <dataset>.<view_name> AS
SELECT
d.rampid AS RAMPID,
'1234' AS TARGET_DOMAIN,
'RAMPID' AS TARGET_TYPE
FROM my_bq_table d;

Insert all RampIDs encoded in your domain into the newly created input table.

-- Example SQL query to insert a row into the input table for translation

INSERT INTO <dataset>.<input_table> VALUES ('some_ramp_id', '1234', 'RAMPID');
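
If your RampIDs already live in another table, you can bulk-load the input table rather than inserting rows one at a time. Here is a minimal sketch, reusing the my_bq_table source and the '1234' example domain from the view example above:

-- Example SQL query to bulk-load the input table from an existing table
INSERT INTO <dataset>.<input_table> (RAMPID, TARGET_DOMAIN, TARGET_TYPE)
SELECT d.rampid, '1234', 'RAMPID'
FROM my_bq_table d;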

The output table is created after you share the tables and run the shared “bq_lr_invoke” stored procedure, as described in the "Call the Shared Stored Procedure to Initiate Translation" section below.

Note

You can translate both maintained RampIDs and derived RampIDs in your table. For more on RampID types and versions, see “RampID”.

Share Tables and Datasets with LiveRamp

For LiveRamp to be able to process the data, the tables and dataset must be shared with LiveRamp’s GCP principal account (LiveRamp will share this service account ID during initial implementation).

The following script shows an example of granting the correct permissions on the two tables created above and on the output dataset.

-- GRANT read access on the input tables
GRANT `roles/bigquery.dataViewer`
ON TABLE <dataset_name>.<input_table_name>
TO "serviceAccount:<LiveRamp Principal Account ID>";

GRANT `roles/bigquery.dataViewer`
ON TABLE <dataset_name>.<metadata_table_name>
TO "serviceAccount:<LiveRamp Principal Account ID>";

-- GRANT write access on the output dataset
GRANT `roles/bigquery.dataEditor`
ON SCHEMA <output_dataset_name>
TO "serviceAccount:<LiveRamp Principal Account ID>";

Optional: Share and Query Data

You can instead use a parameterized, repeatable share script that grants the permissions and then echoes the fully qualified identifiers to send to LiveRamp. See the code below for an example.

BEGIN
  -------------------------------------------------------------------------------
  -- UPDATE VARIABLES WITH CORRECT FULLY QUALIFIED TABLE & DATASET IDENTIFIERS --
  -------------------------------------------------------------------------------
  -- input_table is the table that contains the ramp_ids to translate, the target_domain, and the target_type
  DECLARE input_table STRING DEFAULT 'liveramp_consumer_tests.transcoding_input';
  -- input_metadata is the table that holds the arguments for your run, including client_id / client_secret
  DECLARE input_metadata STRING DEFAULT 'liveramp_consumer_tests.metadata_input';
  -- output_dataset is an empty dataset you created and shared with LiveRamp for writing outputs
  DECLARE output_dataset STRING DEFAULT 'liveramp_consumer_tests_output';
  -- project_name is the project in which your input tables and output_dataset live
  DECLARE project_name STRING DEFAULT 'unrestricted-coding';
  -------------------------------------------------------------------------------
  -- END VARIABLES. DO NOT EDIT BELOW THIS LINE.
  -------------------------------------------------------------------------------
  -- liveramp_iam_principal is the LiveRamp principal shared during initial implementation
  DECLARE liveramp_iam_principal STRING DEFAULT 'user:new.user@liveramp.com';

  -- GRANT read access on the input tables
  EXECUTE IMMEDIATE "GRANT `roles/bigquery.dataViewer` ON TABLE " || input_table || " TO '" || liveramp_iam_principal || "';";
  EXECUTE IMMEDIATE "GRANT `roles/bigquery.dataViewer` ON TABLE " || input_metadata || " TO '" || liveramp_iam_principal || "';";

  -- GRANT write access on the output dataset
  EXECUTE IMMEDIATE "GRANT `roles/bigquery.dataEditor` ON SCHEMA " || output_dataset || " TO '" || liveramp_iam_principal || "';";

  -- Echo the fully qualified identifiers to share with LiveRamp
  WITH fqdns AS
    (
      SELECT STRUCT("input_table" as key, project_name || "." || input_table as fqdn) as item
      UNION ALL
      SELECT STRUCT("input_metadata" as key, project_name || "." || input_metadata as fqdn) as item
      UNION ALL
      SELECT STRUCT("output_dataset" as key, project_name || "." || output_dataset as fqdn) as item
    )
  SELECT f.item.*
  FROM fqdns f;

END

Call the Shared Stored Procedure to Initiate Translation

LiveRamp will share a stored procedure to initiate the translation operation with a service account principal you provide. The stored procedure invokes a remote connection from BigQuery to a cloud function hosted in the LiveRamp environment, which runs your job on your behalf. LiveRamp will also create a job_history_table in the shared output dataset, which allows you to track the status of your jobs via a UUID returned by the invocation procedure.

Here is an example of BigQuery SQL for calling the invocation procedure:

DECLARE input_table STRING DEFAULT NULL;
DECLARE meta_table STRING DEFAULT NULL;
DECLARE output_table STRING DEFAULT NULL;

-- This example assumes you've created your input and meta tables ahead of time based on your workflow type

SET input_table = '<project_name>.<dataset_name>.<input_table_name>';
SET meta_table = '<project_name>.<dataset_name>.<metadata_table_name>';
SET output_table = '<project_name>.<output_dataset_name>.<output_table_name>';

CALL `eng-id-embedded-prod-producer.invoke_ds.bq_lr_invoke`(input_table, meta_table, output_table); 

LiveRamp will process the input and write the output to the designated output dataset.
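
To monitor progress before the confirmation email arrives, you can query the job_history_table that LiveRamp creates in the shared output dataset, filtering on the UUID returned by the invocation procedure. A minimal sketch; the exact schema of job_history_table may differ, and the job_id column name here is an assumption:

-- Example SQL query to check the status of a translation job
-- (the job_id column name is an assumption; check the shared job_history_table schema)
SELECT *
FROM `<project_name>.<output_dataset_name>.job_history_table`
WHERE job_id = '<uuid_returned_by_invocation>';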