Allow LiveRamp to Access Your Databricks Data
If your data is stored in Databricks (for example, as Delta tables), you can allow LiveRamp to ingest that data from your Databricks workspace. To set this up, LiveRamp creates a data source in LiveRamp Connect using the connection details you provide.
Note
This workflow is in limited availability and is by invitation only.
Provide the following details to your LiveRamp customer success manager (CSM) for each Databricks environment you want LiveRamp to ingest data from:
Data source name: The name you want to use for this data source.
Server hostname: The Databricks workspace hostname (for example, adb-123456789012345.7.azuredatabricks.net).
Port: The HTTPS port used for connections (typically 443).
HTTP path: The HTTP path for the SQL warehouse or cluster LiveRamp should connect to (for example, /sql/1.0/warehouses/abc123def456).
Catalog (if applicable): The name of the Unity Catalog catalog that contains the schemas and tables you want LiveRamp to ingest.
Schema name: The name of the schema within that catalog.
Table name: The name of the specific table that should be ingested.
Columns to ingest: Whether LiveRamp should ingest all columns or only specific columns (provide the exact column names).
Authentication: Include the following information:
A username or service principal name with read access to the tables you listed.
A Databricks personal access token for that user or service principal.
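Before sending these details to your CSM, you can sanity-check them yourself. The sketch below, which assumes the databricks-sql-connector package (installed with pip install databricks-sql-connector) and uses placeholder hostname, HTTP path, token, and table names, builds the kind of SELECT statement LiveRamp would effectively run and confirms the credentials can read the table:

```python
def sample_query(catalog: str, schema: str, table: str, columns=None) -> str:
    """Build a test SELECT over the three-level name (catalog.schema.table).

    Pass a list of column names to mimic a specific-columns ingest, or
    omit it to mimic an all-columns ingest.
    """
    cols = ", ".join(columns) if columns else "*"
    return f"SELECT {cols} FROM {catalog}.{schema}.{table} LIMIT 5"


def verify_read_access(catalog: str, schema: str, table: str) -> list:
    """Connect with the same details you plan to share and fetch a few rows.

    All connection values here are placeholders; substitute your own.
    """
    # Imported here so the rest of the module works without the package.
    from databricks import sql

    with sql.connect(
        server_hostname="adb-123456789012345.7.azuredatabricks.net",
        http_path="/sql/1.0/warehouses/abc123def456",
        access_token="YOUR-PERSONAL-ACCESS-TOKEN",  # for the user/service principal
    ) as conn:
        with conn.cursor() as cursor:
            cursor.execute(sample_query(catalog, schema, table))
            return cursor.fetchall()
```

If verify_read_access raises a permissions error, the user or service principal you listed does not yet have read access to that table and the ingestion would fail the same way.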
Next Steps
For offline (PII-based) data, if this is the first file you're uploading to this dataset, create a support case so the support team can make sure everything ingests correctly. See "Considerations When Uploading the First File to a Dataset" for more information.
Within about 20 minutes after uploading, you can check the ingestion status of your uploaded files on the Files page, either by clicking the "GO TO FILES PAGE" link or by selecting Data In → Files in the navigation menu. For more information on the ingestion process for Activation workflow files, see "Overview of the File Ingestion Process for Activation Workflow Files".
Once your data has been ingested (usually within 1-3 days), you can manage it in Connect and distribute it to your desired destinations.