September 2023 Release Notes
See below for what's been updated and what's new with Habu.
New and Noteworthy
New and Improved Tools
Clean Room User List Activation (Hybrid and Snowflake Patterns)
Hybrid and Snowflake Clean Room users can activate User List Question runs to any of Habu’s Activation Channels (Facebook, Twitter, Snap, etc.).
How to access: See "Clean Room Activation"
Default Roles for New Organizations
Habu now provides common role types to give users a starting point to set up a well-defined permission structure. Default Roles are optional.
How to access: See "Managing LiveRamp Clean Room Users"
Habu Intelligence Bookmarks
To help with navigation, Habu Intelligence users can now set up Habu Intelligence Bookmarks. Bookmarks are shown above the Habu Intelligence module and can route a user directly to specific Dashboards through easy-access buttons.
Data Connections
Salesforce CRM Connector
Using a Salesforce Connected App, users can create Habu Data Connections to any objects and fields in their Salesforce instance.
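Conceptually, a data connection over a set of Salesforce objects and fields corresponds to a SOQL query against the connected instance. The sketch below is purely illustrative (the object and field names are examples, not a Habu schema) and shows the kind of query such a connection implies:

```python
# Hypothetical illustration: build a SOQL query for a chosen
# Salesforce object and field list, as a data connection over
# "any objects and fields" conceptually does under the hood.
def build_soql(obj, fields, limit=None):
    """Return a SOQL SELECT statement for the given object and fields."""
    query = "SELECT {} FROM {}".format(", ".join(fields), obj)
    if limit is not None:
        query += " LIMIT {}".format(limit)
    return query

print(build_soql("Contact", ["Id", "Email", "LastName"], limit=100))
# → SELECT Id, Email, LastName FROM Contact LIMIT 100
```

In practice the Connected App handles OAuth and the query execution; only the object/field selection is exposed to the user.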
Cloud Integrations
Habu continues to expand its integrations with Cloud Data Clean Room products, offering a single platform where clients can manage all of their data collaboration activity.
Databricks Clean Rooms
Using the Databricks Pattern, Habu can now offer Databricks Clean Rooms.
Google BigQuery Clean Rooms
Using the BigQuery Pattern, Habu can now offer BigQuery Clean Rooms.
Snowflake Updates
Note
Users must be on the latest version of the Habu Snowflake CLI (version 4.1.1) to use the enhancements below.
Habu has added several enhancements to our Snowflake Clean Rooms:
Analytic Pipelines: Snowflake Clean Room Owners can now share run outputs to Habu's and partners' Snowflake instances.
Snowpark: Habu Snowflake now supports multi-party computation within Snowpark. This includes using Python in Data Collaborations, using common packages like pandas and scikit-learn, and bringing in external functions for use in Snowpark.
External Table Setup: Habu has streamlined the external table setup process, allowing non-Snowflake customers to bring their S3- and GCS-housed data into Snowflake Clean Rooms.
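To make the Snowpark enhancement above concrete, here is a minimal sketch of the kind of pandas logic a collaboration might run inside a Snowpark Python procedure: a two-party overlap count on hashed emails. All names and the hashing convention are assumptions for illustration, not Habu's actual implementation:

```python
# Hypothetical sketch: pandas logic of the sort that could run in a
# Snowpark Python procedure for a two-party overlap question.
import hashlib

import pandas as pd

def hash_email(email):
    """Normalize and SHA-256 hash an email (a common HEM convention)."""
    return hashlib.sha256(email.strip().lower().encode()).hexdigest()

def overlap_count(party_a, party_b):
    """Count distinct hashed emails present in both parties' data."""
    a = set(party_a["email"].map(hash_email))
    b = set(party_b["email"].map(hash_email))
    return len(a & b)

party_a = pd.DataFrame({"email": ["user1@example.com", "User2@example.com"]})
party_b = pd.DataFrame({"email": ["user2@example.com", "user3@example.com"]})
print(overlap_count(party_a, party_b))  # → 1
```

Inside Snowpark, the same function body would operate on data the session pulls from each party's shared tables rather than on locally constructed DataFrames.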
Data Export to Customer S3
Habu can now send Question Runs (List and Analytical) to client S3 buckets, which can be used for reporting (for example, as source data for BI tools) or for activation.
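Once a run lands in the client bucket, it can feed a BI pipeline directly. The sketch below assumes the export is a CSV that has already been downloaded (for example, with boto3's `download_file`); the column names are illustrative, not Habu's actual export schema:

```python
# Hypothetical sketch: load an exported question-run CSV (already
# fetched from the client S3 bucket) into pandas for BI reporting.
import io

import pandas as pd

# Stand-in for a downloaded export file; columns are illustrative.
exported_csv = io.StringIO(
    "segment,matched_users\n"
    "loyal_buyers,1200\n"
    "lapsed_buyers,450\n"
)

df = pd.read_csv(exported_csv)
print(df["matched_users"].sum())  # → 1650
```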
Platform Integrations
Activation Channels and Functionality Added
HEM-based Activation on DV360
REPLACE functionality for TikTok
Offline Integrations for Reddit
Resolved Issues and Usability Enhancements
Added a Records Sent column to Data Out metrics
Enabled users to Delete Clean Rooms
External APIs synced with Python APIs
Improved Search, Filter, Sort, and List Pagination
Added the ability to schedule runs by time of day (UTC)
Added alerts for Breaking Changes in Data Connection Mapping
Improved Data In warnings and validations for unsupported headers, folder path checks, and data connection status