
Conversation

johnmicahreid (Contributor)

No description provided.


netlify bot commented Jul 30, 2025

Deploy Preview for snowplow-docs ready!

| Name | Link |
| --- | --- |
| 🔨 Latest commit | 6358029 |
| 🔍 Latest deploy log | https://app.netlify.com/projects/snowplow-docs/deploys/689358bd3abe290008fe6782 |
| 😎 Deploy Preview | https://deploy-preview-1348--snowplow-docs.netlify.app |

johnmicahreid requested a review from mscwilson on August 4, 2025 08:14
mscwilson force-pushed the add-segment-migration-guide branch from 7e419cd to 6358029 on August 6, 2025 13:29
johnmicahreid marked this pull request as ready for review on August 8, 2025 09:01
johnmicahreid (Contributor, Author) left a comment

@mscwilson overall this is better, and I like having the text shorter. It does feel a bit too terse and low-level at the moment, though - it only focuses on tracking and assumes a lot of knowledge about Snowplow. I wouldn't mind adding a bit more context about our platform, bearing in mind that the likely audience is someone looking to understand the overall differences and what they should be thinking about.

@@ -8,22 +8,7 @@ This guide helps technical implementers migrate from Segment to Snowplow.

## Platform differences

There are a number of differences between Segment and Snowplow as data platforms.
johnmicahreid (Contributor, Author) commented:

I think this feature comparison table is actually quite important - we can probably make the text a bit less marketing-speaky and more neutral, but we should still lay out the differences at a higher level.

mscwilson (Collaborator) replied:

That depends on what this document is for. I think an overall comparison of the platforms is a marketing thing - they can already read about that on our marketing website: https://snowplow.io/comparisons/snowplow-vs-segment

We could link to that comparison?

This guide is about how to technically migrate - what concepts map to what, and what they need to action to make it work.

johnmicahreid (Contributor, Author) replied:

Yup, let's at least link to the comparison.

2. Implement and validate
3. Cutover and finalize

### Assess and plan
johnmicahreid (Contributor, Author) commented:

The section from here on out feels a bit too bullet-point-y - definitely AI-generated, but without quite enough context or detail to actually be useful. We should assume that people reading this migration guide are not actually familiar with Snowplow, so taking a bit of extra time to explain concepts in more detail could be helpful.

I also find myself wanting numbers? e.g. "1. Assess and plan", "1.1 Audit existing implementation", etc.


Segment loads each custom event type into a separate table, for example `order_completed` or `product_viewed`. Analysts must `UNION` tables together to reconstruct user journeys.

Snowplow uses a single [`atomic.events`](/docs/fundamentals/canonical-event/index.md) table in warehouses like Snowflake and BigQuery. Events and entities are stored as structured columns within that table, simplifying analysis.
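
To make the difference concrete, here is a minimal sketch of the same two-event query against each layout (the Segment-side schema name `segment_prod` is hypothetical; the Snowplow columns are standard canonical event fields):

```sql
-- Segment: one table per event type, so cross-event analysis
-- means UNIONing the per-event tables back together.
-- (The "segment_prod" schema name is hypothetical.)
SELECT user_id, received_at, 'order_completed' AS event_name
FROM segment_prod.order_completed
UNION ALL
SELECT user_id, received_at, 'product_viewed' AS event_name
FROM segment_prod.product_viewed
ORDER BY received_at;

-- Snowplow: every event lands in atomic.events, so the same
-- journey is a single filter on event_name.
SELECT domain_userid, collector_tstamp, event_name
FROM atomic.events
WHERE event_name IN ('order_completed', 'product_viewed')
ORDER BY collector_tstamp;
```
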
johnmicahreid (Contributor, Author) commented:

I would also mention that we provide out-of-the-box data models for common analytics use cases, and link to our dbt packages. Otherwise we risk this migration guide focusing solely on the upstream tracking bit, rather than comparing one data platform to another.

- Migrate BI dashboards to query Snowplow tables
- Test all data-dependent workflows

#### Configure integrations
johnmicahreid (Contributor, Author) commented:

I think we could say more here - both of these are full products in their own right, and ProfServ has mentioned that these have been quite tricky to do in the past.

- Translate Segment events into Snowplow self-describing events
  - The [Snowplow CLI](/docs/data-product-studio/snowplow-cli/index.md) MCP server can help with this
- Identify reusable entities that can replace repeated properties
- Create JSON schemas for all events and entities
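
To make the first bullet concrete, here is a minimal sketch of translating a Segment `track` call into a Snowplow self-describing event using the `@snowplow/browser-tracker` package (the `com.acme` vendor, schema version, collector URL, and property names are all hypothetical):

```typescript
import { newTracker, trackSelfDescribingEvent } from '@snowplow/browser-tracker';

// One-time tracker setup; the collector URL is a placeholder.
newTracker('sp1', 'https://collector.example.com', { appId: 'my-app' });

// Segment equivalent, for comparison:
//   analytics.track('Order Completed', { order_id: 'o-123', revenue: 49.99 });

// In Snowplow, the event references a JSON schema by its Iglu URI
// (vendor/name/format/version) and is validated against that schema
// in the pipeline.
trackSelfDescribingEvent({
  event: {
    schema: 'iglu:com.acme/order_completed/jsonschema/1-0-0',
    data: {
      order_id: 'o-123',
      revenue: 49.99,
    },
  },
});
```

The matching `order_completed` schema would be authored and published first, which is where the Snowplow CLI and its MCP server come in.
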
johnmicahreid (Contributor, Author) commented:

We should mention Data Products here - it is about more than just events and entities. And this isn't quite the right order of operations - the MCP server does actually create the JSON schemas.
