
Using Code Sync


If you would like early access to the following Code Sync features, create a Safe Haven support case in the LiveRamp Community portal.

You can synchronize code files in your organization's repository with the Analytics Environment in the following ways:

  • By adding GitHub Actions (a Webhook action and a Collaborator action) to your repository

  • By using the Code Sync API, typically from a GitHub workflow

This allows you to use code that you develop outside of LiveRamp and manage it in GitHub.

Adding GitHub Actions

Once you add the Webhook and Collaborator actions to your GitHub account, you can manage code sync for your organization and your partners, and then access the synced code in the Analytics Environment using the GitLab extension in JupyterLab.

Procedure. To add a Webhook action:
  1. In GitHub, select the Settings tab.

  2. In the "Code and automation" section, click Webhooks.

  3. Click Add webhook. The "Add webhook" page is displayed.

  4. In the Payload URL box, enter

  5. From the Content type list, select application/json.

  6. In the "Which events would you like to trigger this webhook?" section, select the Just the push event option.

  7. Click Add webhook.

Procedure. To add a Collaborator action:
  1. In GitHub, select the Settings tab.

  2. In the "Access" section, click Collaborators.

  3. Click Add people. The "Add collaborator" page is displayed.

  4. In the box, enter the Safe Haven public GitHub account: lsh-ae-code-sync.

  5. Click Add lsh-ae-code-sync to this repository.

Clone a Repository

You can use the GitLab extension to clone the repository that you want to sync with Analytics Environment. For information on setting up the GitLab extension, see "Use JupyterLab's GitLab Extension."

To set up the code sync, you will need to adjust the following GitLab SSH URL for your organization's region, tenant ID, and repo:

    git@gitlab-{region}.datalake-loading.internal:grp-tenant-{tenant_ID}/{repo_name}.git

  • {region}: Your organization's region, which can be eu, us, or au.

  • {tenant_ID}: Your tenant ID, which was provided to you by your LiveRamp representative.

  • {repo_name}: The name of your GitHub repository.
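
As an illustration, the placeholders above can be substituted in a shell script. The region, tenant ID, and repo name below are hypothetical example values, not real credentials:

```shell
# Hypothetical example values -- substitute your own region, tenant ID, and repo name
REGION="us"
TENANT_ID="123456"
REPO_NAME="my-code-repo"

# Assemble the GitLab SSH URL from the template in the text above
GIT_URL="git@gitlab-${REGION}.datalake-loading.internal:grp-tenant-${TENANT_ID}/${REPO_NAME}.git"
echo "${GIT_URL}"
# -> git@gitlab-us.datalake-loading.internal:grp-tenant-123456/my-code-repo.git
```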

Procedure. To clone your repository:
  1. Log in to Analytics Environment and open JupyterLab.

  2. Open your home folder.

  3. Click the GitLab extension icon.

  4. Click Clone a Repository. The Clone a repo dialog is displayed.

  5. Enter the GitLab SSH URL that you prepared in the box and click Clone.


Clone a Repository at the Terminal

Alternatively, you can clone your repository at the Terminal in JupyterLab.

Procedure. To clone a repository at the terminal:
  1. Log in to Analytics Environment and open JupyterLab.

  2. Click the Terminal tile.

  3. In the terminal pane, enter the GitLab SSH URL for your organization's region, tenant ID, and repo:

    git clone git@gitlab-{region}.datalake-loading.internal:grp-tenant-{tenant_ID}/{repo_name}.git

Using the Code Sync API

You can use the Code Sync API to send your code files from your code repository to your Analytics Environment, enabling the use of code that you develop outside of LiveRamp.

To use the Code Sync API, you need the following account credentials to obtain an access token:

  • A username, which is your service account ID

  • A password (also referred to as a "secret key")

Procedure. To add the credentials to your GitHub account:
  1. In your GitHub account, select the "Settings" tab.

  2. In the "Security" section, select Secrets and variables, and then click Actions.

  3. On the "Actions secrets" page, click New repository secret and add your password and username.


If you lack these credentials or need a LiveRamp Service Account, create a Safe Haven case in the LiveRamp Community portal or contact your Customer Success Manager. You should receive a JSON file that includes the needed credentials.


Access tokens expire at the interval specified by the expires_in value. For example, "expires_in":600 indicates 10 minutes (600 seconds). To ensure that calls are successful, set up an automated way to fetch new access tokens.
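
As a sketch of that automation, the refresh interval can be derived from the token response itself. The JSON body below is a hypothetical example shaped like the expires_in example above; a real response comes from the token endpoint:

```shell
# Hypothetical token response body
RESPONSE='{"access_token":"abc123","expires_in":600}'

# Pull expires_in out of the JSON
EXPIRES_IN=$(echo "$RESPONSE" | python3 -c "import sys, json; print(json.load(sys.stdin)['expires_in'])")

# Refresh 60 seconds before expiry to leave headroom for in-flight requests
REFRESH_AFTER=$((EXPIRES_IN - 60))
echo "$REFRESH_AFTER"
# -> 540
```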

Request an Access Token

Use the following cURL command to request an access token:

curl -X POST \
  --header 'Content-Type: application/x-www-form-urlencoded' \
  -d grant_type=password \
  -d username=$USERNAME \
  -d password=$PASSWORD \
  -d client_id="liveramp-api" \
  -d response_type=token

Where: $USERNAME is your service account ID and $PASSWORD is your secret key.

The response will include the access token and indicate an expires_in value that is expressed in seconds, such as 600 (10 minutes). You must use the access token to make any API requests before it expires.
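
To extract the token from that response in a script, a small python3 one-liner works (the response body shown here is a hypothetical example):

```shell
# Hypothetical token response; a real one comes from the token request above
RESPONSE='{"access_token":"abc123","token_type":"bearer","expires_in":600}'

# Parse the access_token field out of the JSON response
ACCESS_TOKEN=$(echo "$RESPONSE" | python3 -c "import sys, json; print(json.load(sys.stdin)['access_token'])")
echo "$ACCESS_TOKEN"
# -> abc123
```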

Encode Your File

Use the following Base64 encoding command to encode a file located in your repo and save it as a post_data.json file:

base64Data="$(base64 -w 0 {codefile})"
echo "{\"base64Data\": \"${base64Data}\",\"bucket\": \"coderepo\",\"filePath\": \"upload/bigquery_tool-0.1.0-py3.8.egg\",\"tenantId\": {tenant_ID}}" > post_data.json

Where: {codefile} is your code file and {tenant_ID} is your organization's tenant ID.
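
As a quick sanity check, you can run the encoding step on a throwaway file and confirm that the payload parses as JSON and that the data round-trips. The file name and tenant ID below are hypothetical:

```shell
# Create a throwaway file in place of your real code file
printf 'hello' > sample.egg

# Encode it; -w 0 (GNU coreutils) disables line wrapping so the JSON stays on one line
base64Data="$(base64 -w 0 sample.egg)"

# 123456 is a hypothetical tenant ID -- use your own
echo "{\"base64Data\": \"${base64Data}\",\"bucket\": \"coderepo\",\"filePath\": \"upload/sample.egg\",\"tenantId\": 123456}" > post_data.json

# Verify: decode the payload back out of the JSON file
python3 -c "import json, base64; d=json.load(open('post_data.json')); print(base64.b64decode(d['base64Data']).decode())"
# -> hello
```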

Send Your File to Your Analytics Environment

Use the following cURL command to POST your post_data.json file from your repository to your Analytics Environment:

curl --location --request POST '' \
     --header "Authorization: Bearer {access_token}" \
     --header 'Content-Type: application/json' \
     -d @post_data.json

Where: {access_token} is the access token you received.

Adding a GitHub Workflow

You can create a GitHub workflow that will call the Code Sync API to sync your code.

Procedure. To add a workflow:
  1. In GitHub, select the Actions tab.

  2. In the Choose a workflow section, click set up a workflow yourself.

  3. Enter the code that contains your GitHub actions.


Sample GitHub Workflow Code

# This is a basic workflow to help you get started with Actions
name: Code Sync Workflow

# Use the on key to specify what events trigger your workflow
on:
  # Controls when the workflow will run
  push:
    # The push event has a branches filter that causes your workflow to run only
    # when a push to a branch that matches the filter occurs, instead of on any push
    # Replace the branch name "test-egg-build-push" with your branch name
    branches:
      - test-egg-build-push
  # Allows you to run this workflow manually from the Actions tab
  workflow_dispatch:

# A workflow run is made up of one or more jobs that can run sequentially or in parallel
jobs:
  # This workflow contains a single job called "build_and_sync"
  build_and_sync:
    # The type of runner that the job will run on
    runs-on: ubuntu-latest
    # Steps represent a sequence of tasks that will be executed as part of the job
    steps:
      # Checks out your repository under $GITHUB_WORKSPACE, so your job can access it
      - uses: actions/checkout@v3
        with:
          # Replace the branch name "test-egg-build-push" and specify the branch that you want to pull
          ref: test-egg-build-push
      - name: Build the project
        run: |
          cd ./python
          python ./setup.py bdist_egg
      - name: Call API to upload the file
        env:
          # LR_API_USER and LR_API_PASSWORD should be provided by LiveRamp. In your
          # GitHub project, go to Settings -> Security -> Secrets and variables -> Actions,
          # add them as repository secrets, and reference them as shown here
          LR_API_USER: ${{ secrets.LR_API_USER }}
          LR_API_PASSWORD: ${{ secrets.LR_API_PASSWORD }}
        run: |
          # Build the form data for the token request (see "Request an Access Token")
          DATA="grant_type=password&username=${LR_API_USER}&password=${LR_API_PASSWORD}&client_id=liveramp-api&response_type=token"
          # TOKEN_URL is the access token endpoint provided by LiveRamp
          result=$(curl -X POST --header 'Content-Type: application/x-www-form-urlencoded' --data "${DATA}" ${TOKEN_URL})
          echo $result
          # Get the access_token from the response
          access_token=$(echo $result | python3 -c "import sys, json; print(json.load(sys.stdin)['access_token'])")
          echo "$access_token"
          # Encode the content with base64
          # Replace "./python/dist/bigquery_tool-0.1.0-py3.8.egg" with the file that you want to upload
          base64Data="$(base64 -w 0 ./python/dist/bigquery_tool-0.1.0-py3.8.egg)"
          echo ${base64Data}
          # This call uploads the file bigquery_tool-0.1.0-py3.8.egg to the LiveRamp default code GCS bucket
          # Replace "000000" with your own tenant ID
          curl --location --request POST '' \
          --header "Authorization: Bearer ${access_token}" \
          --header 'Content-Type: application/json' \
          --data-raw "{
              \"base64Data\": \"${base64Data}\",
              \"bucket\": \"coderepo\",
              \"filePath\": \"upload/bigquery_tool-0.1.0-py3.8.egg\",
              \"tenantId\": 000000
          }"