
Commit 44e8a81

Merge pull request #6731 from segmentio/develop
Release 24.25.2
2 parents 662ccd5 + 067c1b5 commit 44e8a81

File tree

9 files changed: +194 -8 lines changed

src/_data/catalog/destination_categories.yml

Lines changed: 1 addition & 1 deletion

```diff
@@ -1,5 +1,5 @@
 # AUTOGENERATED FROM PUBLIC API. DO NOT EDIT
-# destination categories last updated 2024-06-18
+# destination categories last updated 2024-06-20
 items:
 - display_name: A/B Testing
   slug: a-b-testing
```
src/_data/catalog/destinations.yml

Lines changed: 3 additions & 3 deletions

```diff
@@ -1,5 +1,5 @@
 # AUTOGENERATED FROM PUBLIC API. DO NOT EDIT
-# destination data last updated 2024-06-18
+# destination data last updated 2024-06-20
 items:
 - id: 637e8d185e2dec264895ea89
   display_name: 1Flow
@@ -89648,9 +89648,9 @@ items:
   - Email Marketing
   - Marketing Automation
   logo:
-    url: https://cdn.filepicker.io/api/file/UzxHop1USzuvi18EU2U8
+    url: https://cdn-devcenter.segment.com/dab5dd17-7d99-4012-b1f4-03f3415d986b.svg
   mark:
-    url: https://cdn.filepicker.io/api/file/ipio14InSTGUFALKnHjn
+    url: https://cdn-devcenter.segment.com/d6ca24ea-5285-42a2-a025-c7a8fe14c608.svg
   methods:
     track: true
     identify: true
```

src/_data/catalog/destinations_private.yml

Lines changed: 1 addition & 1 deletion

```diff
@@ -1,5 +1,5 @@
 # AUTOGENERATED FROM PUBLIC API. DO NOT EDIT
-# destination data last updated 2024-06-18
+# destination data last updated 2024-06-20
 items:
 - id: 54521fd925e721e32a72eee1
   display_name: Pardot
```

src/_data/catalog/regional-supported.yml

Lines changed: 9 additions & 0 deletions

```diff
@@ -1097,6 +1097,15 @@ sources:
   endpoints:
   - us
   - eu
+- id: q4JbVJwmrg
+  display_name: Yotpo
+  hidden: false
+  slug: yotpo
+  url: connections/sources/catalog/cloud-apps/yotpo
+  regions:
+  - us
+  endpoints:
+  - us
 - id: 117eYCe9jH
   display_name: Youbora
   hidden: true
```

src/_data/catalog/source_categories.yml

Lines changed: 1 addition & 1 deletion

```diff
@@ -1,5 +1,5 @@
 # AUTOGENERATED FROM PUBLIC API. DO NOT EDIT
-# source categories last updated 2024-06-18
+# source categories last updated 2024-06-20
 items:
 - display_name: A/B Testing
   slug: a-b-testing
```

src/_data/catalog/sources.yml

Lines changed: 21 additions & 2 deletions

```diff
@@ -1,5 +1,5 @@
 # AUTOGENERATED FROM PUBLIC API. DO NOT EDIT
-# sources last updated 2024-06-18
+# sources last updated 2024-06-20
 items:
 - id: 8HWbgPTt3k
   display_name: .NET
@@ -1905,7 +1905,8 @@ items:
 
     We will update SendGrid data every ~3 hours.
   logo:
-    url: https://d3hotuclm6if1r.cloudfront.net/logos/sendgrid-default.svg
+    url: >-
+      https://cdn-devcenter.segment.com/b59eb729-568c-4461-8a8d-62ee9896c520.svg
   categories:
   - Email Marketing
 - id: GCeG0vmcDW
@@ -2245,6 +2246,24 @@ items:
     url: https://cdn.filepicker.io/api/file/dx6hyOr7S7qEZkTtzNMj
   categories:
   - Mobile
+- id: q4JbVJwmrg
+  display_name: Yotpo
+  isCloudEventSource: true
+  slug: yotpo
+  url: connections/sources/catalog/cloud-apps/yotpo
+  hidden: false
+  regions:
+  - us
+  endpoints:
+  - us
+  source_type: cloud-app
+  description: Personalized, automated campaigns driving eCommerce revenue
+  logo:
+    url: >-
+      https://cdn-devcenter.segment.com/045a832e-e8d3-41e0-bfa9-b97bc854c363.svg
+  categories:
+  - Marketing Automation
+  - Email Marketing
 - id: 117eYCe9jH
   display_name: Youbora
   isCloudEventSource: true
```

src/connections/destinations/catalog/klaviyo/index.md

Lines changed: 5 additions & 0 deletions

```diff
@@ -26,6 +26,11 @@ To configure Klaviyo as an Event Source to get data into your warehouse or other
 
 ## Migrate to the Klaviyo (Actions) destination
 
+> info ""
+> Segment is not deprecating Klaviyo Classic destinations that use a Web Device Mode configuration. Users that have destinations with this configuration **do not need to take any action**.
+>
+> This migration applies **only** to Klaviyo Classic destinations in Cloud Mode.
+
 Starting on June 20th, 2024, Segment will automatically migrate all classic Klaviyo destinations to the new Klaviyo (Actions) destination. Migrated Klaviyo (Actions) destinations will have the same name as your classic destination, with "Migrated" appended.
 
 For example, if you named your classic destination "Email Marketing Campaigns", Segment would name your migrated destination "Email Marketing Campaigns Migrated".
```
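The renaming convention described in that paragraph can be sketched as follows (an illustration of the naming rule, not Segment's actual migration code):

```python
def migrated_name(classic_name: str) -> str:
    # Segment appends "Migrated" to the classic destination's name.
    return f"{classic_name} Migrated"

# Matches the example in the doc text above.
print(migrated_name("Email Marketing Campaigns"))
```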

src/segment-app/iam/scim.md

Lines changed: 3 additions & 0 deletions

```diff
@@ -117,6 +117,9 @@ Instructions for configuring Microsoft Entra ID can be found on the Microsoft Do
 
 2. [Complete the Microsoft Entra ID setup guide for SCIM](https://learn.microsoft.com/en-us/entra/identity/saas-apps/segment-provisioning-tutorial){:target="_blank”}
 
+> info ""
+> To make Azure compatible with Segment's SCIM v2 implementation, append the flag `?aadOptscim062020` to the tenant URL as explained in the [Microsoft Entra ID documentation](https://learn.microsoft.com/en-us/entra/identity/app-provisioning/application-provisioning-config-problem-scim-compatibility#flags-to-alter-the-scim-behavior){:target="_blank”}. By appending the flag to your tenant URL, your request has the correct structure when you remove a user from a group.
+
 ## OneLogin Setup Guide
 
 Instructions for configuring OneLogin can be found on the OneLogin Docs website.
```
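Appending the flag is a plain query-string addition. A minimal sketch, assuming a hypothetical tenant URL (substitute the tenant URL from your own Segment workspace; only the `?aadOptscim062020` flag itself comes from the doc above):

```python
# The base URL below is a fabricated placeholder, not a real Segment tenant URL.
tenant_url = "https://scim.example-segment-tenant.com/scim/v2"

# Append the Entra ID SCIM compatibility flag as a query string.
compatible_url = tenant_url + "?aadOptscim062020"
print(compatible_url)
```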
Lines changed: 150 additions & 0 deletions
---
title: Databricks Setup
beta: true
plan: unify
hidden: true
---

> info "Linked Profiles is in public beta"
> Linked Profiles (Data Graph, Linked Events, and Linked Audiences) is in public beta, and Segment is actively working on this feature. Some functionality may change before it becomes generally available.

On this page, you'll learn how to connect your Databricks data warehouse to the Segment Data Graph.

## Set up Databricks credentials

Sign in to Databricks with admin permissions to create new resources and provide the Data Graph with the necessary permissions.

Segment assumes that you already have a workspace that includes the datasets you'd like to use for the Data Graph. Segment recommends setting up a new Service Principal user with only the permissions to access the required catalogs and schemas.

### Step 1: Set up a Service Principal user

Segment recommends that you set up a new Service Principal user. If you already have a Service Principal user you'd like to use, grant it "Can use" permissions for your data warehouse and proceed to [Step 2: Create a catalog for Segment to store checkpoint tables](#step-2-create-a-catalog-for-segment-to-store-checkpoint-tables).

If you want to create a new Service Principal user, complete the following substeps:

#### Substep 1: Create a new Service Principal user
1. Log in to the Databricks UI as an Admin.
2. Click **User Management**.
3. Select the **Service principals** tab.
4. Click **Add Service Principal**.
5. Enter a Service Principal user name and click **Add**.
6. Select the Service Principal user you just created and click **Generate secret**.
7. Save the **Secret** and **Client ID** to a safe place. You'll need these values to connect your Databricks warehouse to Segment.
8. Navigate to Workspaces and select your Workspace.
9. Select the **Permissions** tab and click **Add Permissions**.
10. Add the newly created Service Principal user and click **Save**.

> success ""
> If you already have a warehouse you'd like to use, you can move on to the next substep, [Substep 2: Add your Service Principal user to Warehouse User Lists](#substep-2-add-your-service-principal-user-to-warehouse-user-lists). If you need to create a new warehouse first, see [Create a new warehouse](#create-a-new-warehouse) before completing the next substep.

#### Substep 2: Add your Service Principal user to Warehouse User Lists
1. Log in to the Databricks UI as an Admin.
2. Navigate to SQL Warehouses.
3. Select your warehouse and click **Permissions**.
4. Add the Service Principal user and grant them "Can use" access.
5. Click **Add**.

##### (Optional) Confirm Service Principal permissions
Confirm that the Service Principal user that you're using to connect to Segment has "Can use" permissions for your warehouse.

To confirm that your Service Principal user has "Can use" permission:
1. In the Databricks console, navigate to SQL Warehouses and select your warehouse.
2. Navigate to Overview and click **Permissions**.
3. Verify that the Service Principal user has "Can use" permission.

### Step 2: Create a catalog for Segment to store checkpoint tables

> warning "Segment recommends creating an empty catalog for the Data Graph"
> If you plan to use an existing catalog with Reverse ETL, follow the instructions in the [Update user access for Segment Reverse ETL catalog](#update-user-access-for-segment-reverse-etl-catalog) section.

Segment requires write access to a catalog to create a schema for internal bookkeeping, and to store checkpoint tables for the queries that are executed.

Segment recommends creating an empty catalog for this purpose by running the following SQL. This is also the catalog that you'll be required to specify when setting up your Databricks integration in the Segment app.

```sql
CREATE CATALOG IF NOT EXISTS `SEGMENT_LINKED_PROFILES_DB`;
-- Copy the Client ID by clicking "Generate secret" for the Service Principal user
GRANT USAGE ON CATALOG `SEGMENT_LINKED_PROFILES_DB` TO `${client_id}`;
GRANT CREATE ON CATALOG `SEGMENT_LINKED_PROFILES_DB` TO `${client_id}`;
GRANT SELECT ON CATALOG `SEGMENT_LINKED_PROFILES_DB` TO `${client_id}`;
```

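The `${client_id}` placeholder in these snippets stands for the Service Principal's Client ID saved in Step 1. A small sketch of substituting it before running the SQL (the client ID value below is fabricated for illustration):

```python
from string import Template

# ${client_id} mirrors the placeholder used in the SQL snippets above;
# the substituted value is a made-up example, not a real Client ID.
grant_stmt = Template(
    "GRANT USAGE ON CATALOG `SEGMENT_LINKED_PROFILES_DB` TO `${client_id}`;"
)
rendered = grant_stmt.substitute(client_id="1ab2c3d4-0000-1111-2222-333344445555")
print(rendered)
```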
### Step 3: Grant read-only access to the Profiles Sync catalog

Run the following SQL to grant the Data Graph read-only access to the Profiles Sync catalog:

```sql
GRANT USAGE, SELECT, USE SCHEMA ON CATALOG `${profiles_sync_catalog}` TO `${client_id}`;
```

### Step 4: Grant read-only access to additional catalogs for the Data Graph
Run the following SQL to grant your Service Principal user read-only access to any additional catalogs you want to use for the Data Graph:

```sql
-- Run this command for each catalog you want to use for the Segment Data Graph
GRANT USAGE, SELECT, USE SCHEMA ON CATALOG `${catalog}` TO `${client_id}`;
```

### (Optional) Restrict read-only access to schemas

Restrict access to specific schemas by running the following SQL:

```sql
GRANT USAGE ON CATALOG `${catalog}` TO `${client_id}`;
USE CATALOG `${catalog}`;
GRANT USAGE, SELECT ON SCHEMA `${schema_1}` TO `${client_id}`;
GRANT USAGE, SELECT ON SCHEMA `${schema_2}` TO `${client_id}`;
...
```

### (Optional) Restrict read-only access to tables
Restrict access to specific tables by running the following SQL:

```sql
GRANT USAGE ON CATALOG `${catalog}` TO `${client_id}`;
USE CATALOG `${catalog}`;
GRANT USAGE ON SCHEMA `${schema_1}` TO `${client_id}`;
USE SCHEMA `${schema_1}`;
GRANT SELECT ON TABLE `${table_1}` TO `${client_id}`;
GRANT SELECT ON TABLE `${table_2}` TO `${client_id}`;
...
```

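The trailing `...` in the SQL above means one `GRANT SELECT` per additional table. A helper sketch that generates the full statement list for a set of tables (the catalog, schema, and table names are placeholders):

```python
def table_grants(catalog, schema, tables, client_id):
    """Generate the table-level GRANT statements shown in the SQL block above."""
    stmts = [
        f"GRANT USAGE ON CATALOG `{catalog}` TO `{client_id}`;",
        f"USE CATALOG `{catalog}`;",
        f"GRANT USAGE ON SCHEMA `{schema}` TO `{client_id}`;",
        f"USE SCHEMA `{schema}`;",
    ]
    # One SELECT grant per table, matching the "..." pattern.
    stmts += [f"GRANT SELECT ON TABLE `{t}` TO `{client_id}`;" for t in tables]
    return stmts

for stmt in table_grants("analytics", "prod", ["users", "orders"], "client-id-example"):
    print(stmt)
```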
### Step 5: Validate the permissions of your Service Principal user

Sign in to the [Databricks CLI with your Client ID secret](https://docs.databricks.com/en/dev-tools/cli/authentication.html#oauth-machine-to-machine-m2m-authentication){:target="_blank”} and run the following SQL to verify the Service Principal user has the correct permissions for a given table.

> success ""
> If this command succeeds, you can view the table.

```sql
USE DATABASE ${linked_read_only_database};
SHOW SCHEMAS;
SELECT * FROM ${schema}.${table} LIMIT 10;
```

### Step 6: Connect your warehouse to Segment

Segment requires the following settings to connect to your Databricks warehouse. You can find these details in your Databricks workspace by navigating to **SQL Warehouse > Connection details**.

- **Hostname**: The address of your Databricks server
- **Http Path**: The address of your Databricks compute resources
- **Port**: The port used to connect to your Databricks warehouse. The default port is 443, but your port might be different.
- **Catalog**: The catalog you designated in [Step 2: Create a catalog for Segment to store checkpoint tables](#step-2-create-a-catalog-for-segment-to-store-checkpoint-tables)
- **Service principal client ID**: The client ID used to access your Databricks warehouse
- **OAuth secret**: The OAuth secret used to connect to your Databricks warehouse

After identifying these settings, continue setting up the Data Graph by following the instructions in [Connect your warehouse to the Data Graph](/docs/unify/linked-profiles/data-graph/#step-2-connect-your-warehouse-to-the-data-graph).

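As an illustration of how the hostname, port, and HTTP path fit together, the sketch below assembles them into the HTTPS endpoint a SQL client would typically target. All three values are fabricated placeholders; use the details from your own Connection details page:

```python
# Placeholders, not real connection details.
hostname = "dbc-example.cloud.databricks.com"   # Hostname
http_path = "/sql/1.0/warehouses/abc123def456"  # Http Path
port = 443                                      # default port; yours might differ

# A SQL client generally addresses the warehouse as host:port plus the HTTP path.
endpoint = f"https://{hostname}:{port}{http_path}"
print(endpoint)
```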
## Additional setup for warehouse permissions

### Update user access for Segment Reverse ETL catalog
Run the following SQL if you run into an error in the Segment app indicating that the user doesn't have sufficient privileges on an existing `_segment_reverse_etl` schema.

If Segment Reverse ETL has ever run in the catalog you are configuring as the Segment connection catalog, a Segment-managed schema already exists, and you need to grant the new Segment user access to it. Update the Databricks table permissions by running the following SQL:

```sql
GRANT ALL PRIVILEGES ON SCHEMA ${segment_internal_catalog}.__segment_reverse_etl TO `${client_id}`;
```
