Allow LiveRamp to Access Your Azure Blob Storage Data

If your data files are stored in Microsoft Azure Blob Storage, you can allow LiveRamp to ingest those files from your storage account by having a data source created for you in LiveRamp Connect.

Note

This workflow is in limited availability and is by invitation only.

Provide the following details to your LiveRamp customer success manager (CSM) for each Azure container you want LiveRamp to ingest data from:

  • Data source name: The name you want to use for this data source.

  • Container name: The name of the container that contains the files you want LiveRamp to ingest.

  • Source file path: The folder path and, if needed, file pattern within the container that identifies the files to ingest. Examples:

    • input/activation/

    • input/activation/*.csv

    • events/daily/*.parquet

  • Azure connection string: A connection string that grants read access to the container and path you specified (including permissions to list and read blobs).

  • File format details: File format (for example, CSV, TSV, or Parquet) and any parsing details that matter for ingestion, such as:

    • Field delimiter.

    • Presence of a header row.

    • Character encoding.
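Before you send these details to your CSM, it can help to sanity-check them locally. The sketch below (illustrative only; the function names and example values are not part of LiveRamp's workflow) checks that a connection string carries the standard Azure fields needed for read access (`AccountName` plus either `AccountKey` or `SharedAccessSignature`), and shows how a source file path such as `input/activation/*.csv` behaves as a glob pattern over blob names:

```python
from fnmatch import fnmatch

def parse_connection_string(conn_str):
    """Split an Azure-style 'Key1=Value1;Key2=Value2' string into a dict.

    Values (e.g. base64 account keys) may themselves contain '=', so only
    the first '=' in each segment is treated as the separator.
    """
    parts = {}
    for segment in conn_str.split(";"):
        if not segment:
            continue
        key, _, value = segment.partition("=")
        parts[key] = value
    return parts

def has_read_credentials(parts):
    """A usable connection string names the account and carries a key or SAS."""
    return "AccountName" in parts and (
        "AccountKey" in parts or "SharedAccessSignature" in parts
    )

# Placeholder values, not real credentials:
conn = ("DefaultEndpointsProtocol=https;AccountName=mystorageacct;"
        "AccountKey=abc123==;EndpointSuffix=core.windows.net")
print(has_read_credentials(parse_connection_string(conn)))  # True

# A source file path with a pattern acts as a glob over blob names:
print(fnmatch("input/activation/2024-06-01.csv", "input/activation/*.csv"))  # True
print(fnmatch("input/other/2024-06-01.csv", "input/activation/*.csv"))       # False
```

This only checks the shape of the string; it does not verify that the credentials actually grant list and read permissions on the container, which your CSM will confirm during setup.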

Next Steps

For offline (PII-based) data, if this is the first file you're uploading to this dataset, create a support case so the support team can make sure everything ingests correctly. See "Considerations When Uploading the First File to a Dataset" for more information.

Within about 20 minutes after uploading, you can check the ingestion status of your uploaded files on the Files page, either by clicking the "GO TO FILES PAGE" link or by selecting Data In > Files in the navigation menu. For more information on the ingestion process for Activation workflow files, see "Overview of the File Ingestion Process for Activation Workflow Files".

Once your data has been ingested (usually within 1-3 days), you can manage it in Connect and distribute it to your desired destinations.