diff --git a/.github/styles/Snowplow/Acronyms.yml b/.github/styles/Snowplow/Acronyms.yml index b2111bc7d..5d2877f65 100644 --- a/.github/styles/Snowplow/Acronyms.yml +++ b/.github/styles/Snowplow/Acronyms.yml @@ -65,7 +65,7 @@ exceptions: - ZIP # Added for Snowplow - - BDP + - CDI - DNS - SQS - UUID diff --git a/.github/styles/config/vocabularies/snowplow/accept.txt b/.github/styles/config/vocabularies/snowplow/accept.txt index d74edc7b5..de99a60a1 100644 --- a/.github/styles/config/vocabularies/snowplow/accept.txt +++ b/.github/styles/config/vocabularies/snowplow/accept.txt @@ -53,7 +53,7 @@ Data Product Studio Data Model Packs Data Model Pack AWS Console -BDP Console +Snowplow Console Iglu Central Iglu Server Igluctl diff --git a/CLAUDE.md b/CLAUDE.md index cd0d6042a..4a60ef951 100644 --- a/CLAUDE.md +++ b/CLAUDE.md @@ -48,7 +48,7 @@ * Mix prose and lists appropriately: use prose to explain concepts, lists for configuration options or step-by-step items ### Snowplow terminology -* **Capitalized**: Data Product Studio, Snowtype, Snowplow BDP, Signals +* **Capitalized**: Data Product Studio, Snowtype, Snowplow CDI, Signals * **Context-capitalized**: Collector, Enrich, specific Loaders * **Not capitalized**: entities, events, schemas, data structures * Use "entity" not "context", "self-describing event" not "unstructured event" @@ -120,11 +120,6 @@ Important cautions about data loss, security, or breaking changes * **Tip**: performance improvements, recommended workflows, pro tips * **Warning**: data loss risks, security considerations -### BDP vs self-hosted -* Mark BDP-only features in frontmatter: `sidebar_custom_props: offerings: - bdp` -* Note when BDP provides built-in functionality, mention self-hosted alternatives -* **Do not mention "Community Edition"** - use "self-hosted" instead but otherwise prefer not to mention - ### Images * Formats: `.webp` (preferred), `.png`, `.jpg` * Descriptive filenames and alt text diff --git a/README.md b/README.md index a2c792a36..f9ebf7b4a 100644 --- a/README.md +++ b/README.md @@ -10,7 +10,6 @@ This is the source for https://docs.snowplow.io/docs. - [Organizing content](#organizing-content) - [Sidebar](#sidebar) - [Updating sidebar attributes for multiple sections at once](#updating-sidebar-attributes-for-multiple-sections-at-once) - - [Offerings](#offerings) - [Links](#links) - [Concepts](#concepts) - [Reusable fragments](#reusable-fragments) @@ -143,26 +142,6 @@ It'll update the `index.md` files as appropriate. You can now delete the `update_attributes_here.txt` file. -### Offerings - -Some documentation is only relevant to a particular offering. You can indicate it like this: -``` ---- -title: ... -... -sidebar_custom_props: - offerings: - - bdp -... ---- -``` - -This will result in an icon appearing in the sidebar, as well as an automatic banner on the page, specifying that the docs only apply to a given offering. - -The available values are: `bdp` and `community`. Do not specify both values at once — if a piece of documentation is relevant to all offerings, there should be no `offerings` property as that’s the default. - -Whenever the same functionality can be achieved in multiple offerings but in a different way (e.g. managing schemas), create a parent folder (“Managing schemas”) that’s offering-neutral, and then add offering-specific pages inside it. This way, other pages can link to the generic page without having to specify different methods for different offerings. 
- ### Links For links within this documentation, please end the link with `/index.md`. This way all links will be checked, and you’ll get an error if a link is broken at any point. diff --git a/docs/account-management/index.md b/docs/account-management/index.md index 228a9f541..3436f74f1 100644 --- a/docs/account-management/index.md +++ b/docs/account-management/index.md @@ -4,23 +4,21 @@ date: "2020-02-15" sidebar_position: 9 sidebar_custom_props: header: " " - offerings: - - bdp sidebar_label: "Account management" --- -Manage your account configuration and users using the Snowplow BDP Console. You can also use the underlying API directly. This page describes how to acquire an API key. +Manage your account configuration and users using the Snowplow Console. You can also use the underlying API directly. This page describes how to acquire an API key. ## Credentials API -The API that drives BDP Console's functionality is [publicly documented](https://console.snowplowanalytics.com/api/msc/v1/docs/index.html?url=/api/msc/v1/docs/docs.yaml) and available for our customers to invoke via code. All calls to it need to be properly authenticated using JSON Web Tokens (JWT) that can be acquired via the Credentials API. +The API that drives Console's functionality is [publicly documented](https://console.snowplowanalytics.com/api/msc/v1/docs/index.html?url=/api/msc/v1/docs/docs.yaml) and available for our customers to invoke via code. All calls to it need to be properly authenticated using JSON Web Tokens (JWT) that can be acquired via the Credentials API. The process for creating a key has been improved over time. We recommend using the v3 process. ### Version 3 -The following view is available to all customers under [BDP Console settings](https://console.snowplowanalytics.com/credentials): +The following view is available to all customers under [Console settings](https://console.snowplowanalytics.com/credentials): ![](images/accessing-generated-api-keys.png) @@ -95,10 +93,10 @@ Authenticating with v2 only required the API key secret. While this method and t ### Version 1 -Previously, BDP Console was using the Password authentication flow to support machine-to-machine (m2m) applications. Under that scenario a BDP customer had to create a bot user in their account, retrieve a client ID and a client secret, and use all three to acquire a JWT. Customers who have enabled these credentials in the past will see the following panel in their Console account settings: +Previously, Console was using the Password authentication flow to support machine-to-machine (m2m) applications. Under that scenario a customer had to create a bot user in their account, retrieve a client ID and a client secret, and use all three to acquire a JWT. Customers who have enabled these credentials in the past will see the following panel in their Console account settings: ![](images/image-2.png) -Legacy Snowplow BDP credentials management +Legacy Snowplow credentials management This method and the respective credentials still work for those who have been using them, however we strongly advise that customers upgrade to the current iteration where the only secret to be used by m2m applications is an API key which can be exchanged for a JWT. 
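For illustration, a minimal sketch of exchanging a v3 API key for a JWT and then calling the API could look like the following. The endpoint path, header names, and response field shown here are assumptions for the sketch; check the Credentials API reference linked above for the current values.

```bash
# Illustrative sketch only: the endpoint path, header names, and response field
# are assumptions - consult the Credentials API reference for the current values.
ORG_ID="<your-organization-id>"
API_KEY_ID="<your-api-key-id>"
API_KEY="<your-api-key>"

# Exchange the API key for a short-lived JWT
JWT=$(curl -s \
  -H "X-API-Key-Id: ${API_KEY_ID}" \
  -H "X-API-Key: ${API_KEY}" \
  "https://console.snowplowanalytics.com/api/msc/v1/organizations/${ORG_ID}/credentials/v3/token" \
  | jq -r '.accessToken')

# Use the JWT as a Bearer token on subsequent API calls
curl -H "Authorization: Bearer ${JWT}" \
  "https://console.snowplowanalytics.com/api/msc/v1/organizations/${ORG_ID}/<endpoint>"
```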
diff --git a/docs/account-management/managing-permissions/index.md b/docs/account-management/managing-permissions/index.md index 30051d565..ea037cd45 100644 --- a/docs/account-management/managing-permissions/index.md +++ b/docs/account-management/managing-permissions/index.md @@ -8,7 +8,7 @@ To set a users permissions, navigate to `Manage users` and then to the user whos ## What permissions can be set? -Snowplow BDP Console sets permissions for each area of Console as summarized below: +Snowplow Console sets permissions for each area of Console as summarized below: | **Console feature** | **Description** | **Possible permissions** | | ------------------- | ----------------------------------------------------------------------------------------------------------------------------- | ---------------------------------------------------------------------------- | diff --git a/docs/api-reference/enrichment-components/configuration-reference/index.md b/docs/api-reference/enrichment-components/configuration-reference/index.md index 070979bbc..7c626b76d 100644 --- a/docs/api-reference/enrichment-components/configuration-reference/index.md +++ b/docs/api-reference/enrichment-components/configuration-reference/index.md @@ -32,8 +32,8 @@ To accept the terms of license and run Enrich, set the `ACCEPT_LIMITED_USE_LICEN | `monitoring.metrics.statsd.prefix` | Optional. Default: `snowplow.enrich`. Pefix of StatsD metric names. | | `monitoring.healthProbe.port` (since *6.0.0*) | Optional. Default: `8000`. Open a HTTP server that returns OK only if the app is healthy. | | `monitoring.healthProbe.unhealthyLatency` (since *6.0.0*) | Optional. Default: `2 minutes`. Health probe becomes unhealthy if any received event is still not fully processed before this cutoff time. | -| `telemetry.disable` | Optional. Set to `true` to disable [telemetry](/docs/get-started/snowplow-community-edition/telemetry/index.md). | -| `telemetry.userProvidedId` | Optional. See [here](/docs/get-started/snowplow-community-edition/telemetry/index.md#how-can-i-help) for more information. | +| `telemetry.disable` | Optional. Set to `true` to disable [telemetry](/docs/get-started/self-hosted/telemetry/index.md). | +| `telemetry.userProvidedId` | Optional. See [here](/docs/get-started/self-hosted/telemetry/index.md#how-can-i-help) for more information. | | `validation.acceptInvalid` (since *6.0.0*) | Optional. Default: `false`. Enrich *3.0.0* introduces the validation of the enriched events against atomic schema before emitting. If set to `false`, a failed event will be emitted instead of the enriched event if validation fails. If set to `true`, invalid enriched events will be emitted, as before. | | `validation.atomicFieldsLimits` (since *4.0.0*) | Optional. For the defaults, see [here](https://github.com/snowplow/enrich/blob/master/modules/common/src/main/resources/reference.conf). Configuration for custom maximum atomic fields (strings) length. It's a map-like structure with keys being atomic field names and values being their max allowed length. | | `validation.maxJsonDepth` (since *6.0.0*) | Optional. Default: `40`. Maximum allowed depth for the JSON entities in the events. Event will be sent to bad row stream if it contains JSON entity with a depth that exceeds this value. 
| diff --git a/docs/api-reference/failed-events/index.md b/docs/api-reference/failed-events/index.md index 57a59562a..003ac9852 100644 --- a/docs/api-reference/failed-events/index.md +++ b/docs/api-reference/failed-events/index.md @@ -32,7 +32,7 @@ In order for an event to be processed successfully: If your pipeline is generating schema violations, it might mean there is a problem with your tracking, or a problem with your [Iglu resolver](/docs/api-reference/iglu/iglu-resolver/index.md) which lists where schemas should be found. The error details in the schema violation JSON object should give you a hint about what the problem might be. -Snowplow BDP customers should check in the Snowplow BDP Console that all data structures are correct and have been [promoted to production](/docs/data-product-studio/data-structures/manage/index.md). Snowplow Community Edition users should check that the Enrichment app is configured with an [Iglu resolver file](/docs/api-reference/iglu/iglu-resolver/index.md) that points to a repository containing the schemas. +Snowplow customers should check in the Snowplow Console that all data structures are correct and have been [promoted to production](/docs/data-product-studio/data-structures/manage/index.md). Snowplow Self-Hosted users should check that the Enrichment app is configured with an [Iglu resolver file](/docs/api-reference/iglu/iglu-resolver/index.md) that points to a repository containing the schemas. Next, check the tracking code in your custom application, and make sure the entities you are sending conform to the schema definition. diff --git a/docs/api-reference/index.md b/docs/api-reference/index.md index 5ae47634a..305da46d1 100644 --- a/docs/api-reference/index.md +++ b/docs/api-reference/index.md @@ -7,4 +7,4 @@ sidebar_label: "Reference" This section contains detailed technical information about Snowplow components. -Some of the information is relevant only for [Community Edition](/docs/get-started/snowplow-community-edition/index.md) users, as [Snowplow BDP](/docs/get-started/snowplow-bdp/index.md) customers won't need to configure all their own components. +Some of the information is relevant only for [Snowplow Self-Hosted](/docs/get-started/index.md#self-hosted) users, as [Snowplow CDI](/docs/get-started/index.md#customer-data-infrastructure) customers won't need to configure all their own components. diff --git a/docs/api-reference/loaders-storage-targets/databricks-streaming-loader/configuration-reference/_common_config.md b/docs/api-reference/loaders-storage-targets/databricks-streaming-loader/configuration-reference/_common_config.md index 918733e5a..2fd2c69b1 100644 --- a/docs/api-reference/loaders-storage-targets/databricks-streaming-loader/configuration-reference/_common_config.md +++ b/docs/api-reference/loaders-storage-targets/databricks-streaming-loader/configuration-reference/_common_config.md @@ -94,11 +94,11 @@ import Link from '@docusaurus/Link'; telemetry.disable - Optional. Set to true to disable telemetry. + Optional. Set to true to disable telemetry. telemetry.userProvidedId - Optional. See here for more information. + Optional. See here for more information. 
http.client.maxConnectionsPerServer diff --git a/docs/api-reference/loaders-storage-targets/lake-loader/configuration-reference/_common_config.md b/docs/api-reference/loaders-storage-targets/lake-loader/configuration-reference/_common_config.md index 340399a03..eb86d469f 100644 --- a/docs/api-reference/loaders-storage-targets/lake-loader/configuration-reference/_common_config.md +++ b/docs/api-reference/loaders-storage-targets/lake-loader/configuration-reference/_common_config.md @@ -112,11 +112,11 @@ import Link from '@docusaurus/Link'; telemetry.disable - Optional. Set to true to disable telemetry. + Optional. Set to true to disable telemetry. telemetry.userProvidedId - Optional. See here for more information. + Optional. See here for more information. inMemBatchBytes diff --git a/docs/api-reference/loaders-storage-targets/snowflake-streaming-loader/configuration-reference/_common_config.md b/docs/api-reference/loaders-storage-targets/snowflake-streaming-loader/configuration-reference/_common_config.md index aff261c2c..b3cab3f1a 100644 --- a/docs/api-reference/loaders-storage-targets/snowflake-streaming-loader/configuration-reference/_common_config.md +++ b/docs/api-reference/loaders-storage-targets/snowflake-streaming-loader/configuration-reference/_common_config.md @@ -92,11 +92,11 @@ import Link from '@docusaurus/Link'; telemetry.disable - Optional. Set to true to disable telemetry. + Optional. Set to true to disable telemetry. telemetry.userProvidedId - Optional. See here for more information. + Optional. See here for more information. output.good.jdbcLoginTimeout diff --git a/docs/api-reference/loaders-storage-targets/snowplow-rdb-loader/loading-transformed-data/rdb-loader-configuration-reference/index.md b/docs/api-reference/loaders-storage-targets/snowplow-rdb-loader/loading-transformed-data/rdb-loader-configuration-reference/index.md index 52221edeb..591b78b56 100644 --- a/docs/api-reference/loaders-storage-targets/snowplow-rdb-loader/loading-transformed-data/rdb-loader-configuration-reference/index.md +++ b/docs/api-reference/loaders-storage-targets/snowplow-rdb-loader/loading-transformed-data/rdb-loader-configuration-reference/index.md @@ -170,8 +170,8 @@ Only Snowflake Loader can be run on Azure at the moment. | `initRetries.strategy` | Backoff strategy used during retry. The possible values are `JITTER`, `CONSTANT`, `EXPONENTIAL`, `FIBONACCI`. | | `initRetries.attempts` | Optional. How many attempts to make before sending the message into retry queue. If missing, `cumulativeBound` will be used. | | `initRetries.cumulativeBound` | Optional. When backoff reaches this delay, eg '1 hour', the loader will stop retrying. If both this and `attempts` are not set, the loader will retry indefinitely. | -| `telemetry.disable` | Optional. Set to `true` to disable [telemetry](/docs/get-started/snowplow-community-edition/telemetry/index.md). | -| `telemetry.userProvidedId` | Optional. See [here](/docs/get-started/snowplow-community-edition/telemetry/index.md#how-can-i-help) for more information. | +| `telemetry.disable` | Optional. Set to `true` to disable [telemetry](/docs/get-started/self-hosted/telemetry/index.md). | +| `telemetry.userProvidedId` | Optional. See [here](/docs/get-started/self-hosted/telemetry/index.md#how-can-i-help) for more information. 
| ## Common monitoring settings diff --git a/docs/api-reference/loaders-storage-targets/snowplow-rdb-loader/loading-transformed-data/snowflake-loader/index.md b/docs/api-reference/loaders-storage-targets/snowplow-rdb-loader/loading-transformed-data/snowflake-loader/index.md index b3d9d6f46..0e3bffe50 100644 --- a/docs/api-reference/loaders-storage-targets/snowplow-rdb-loader/loading-transformed-data/snowflake-loader/index.md +++ b/docs/api-reference/loaders-storage-targets/snowplow-rdb-loader/loading-transformed-data/snowflake-loader/index.md @@ -14,7 +14,7 @@ It is possible to run Snowflake Loader on AWS, GCP and Azure. ### Setting up Snowflake -You can use the steps outlined in our [quick start guide](/docs/get-started/snowplow-community-edition/quick-start/index.md?warehouse=snowflake#prepare-the-destination) to create most of the necessary Snowflake resources. +You can use the steps outlined in our [quick start guide](/docs/get-started/self-hosted/quick-start/index.md?warehouse=snowflake#prepare-the-destination) to create most of the necessary Snowflake resources. There are two different authentication methods with Snowflake Loader: * With the `TempCreds` method, there are no additional Snowflake resources needed. @@ -43,7 +43,7 @@ Finally, use the `transformedStage` [configuration setting](/docs/api-reference/ ### Running the loader -There are dedicated terraform modules for deploying Snowflake Loader on [AWS](https://registry.terraform.io/modules/snowplow-devops/snowflake-loader-ec2/aws/latest) and [Azure](https://github.com/snowplow-devops/terraform-azurerm-snowflake-loader-vmss). You can see how they are used in our full pipeline deployment examples [here](/docs/get-started/snowplow-community-edition/quick-start/index.md). +There are dedicated terraform modules for deploying Snowflake Loader on [AWS](https://registry.terraform.io/modules/snowplow-devops/snowflake-loader-ec2/aws/latest) and [Azure](https://github.com/snowplow-devops/terraform-azurerm-snowflake-loader-vmss). You can see how they are used in our full pipeline deployment examples [here](/docs/get-started/self-hosted/quick-start/index.md). We don't have a terraform module for deploying Snowflake Loader on GCP yet. Therefore, it needs to be deployed manually at the moment. diff --git a/docs/api-reference/loaders-storage-targets/snowplow-rdb-loader/transforming-enriched-data/reusable/stream-transformer-common/_index.mdx b/docs/api-reference/loaders-storage-targets/snowplow-rdb-loader/transforming-enriched-data/reusable/stream-transformer-common/_index.mdx index 7db7dfbf4..14f6725db 100644 --- a/docs/api-reference/loaders-storage-targets/snowplow-rdb-loader/transforming-enriched-data/reusable/stream-transformer-common/_index.mdx +++ b/docs/api-reference/loaders-storage-targets/snowplow-rdb-loader/transforming-enriched-data/reusable/stream-transformer-common/_index.mdx @@ -36,11 +36,11 @@ import Link from '@docusaurus/Link'; telemetry.disable - Optional. Set to true to disable telemetry. + Optional. Set to true to disable telemetry. telemetry.userProvidedId - Optional. See here for more information. + Optional. See here for more information. 
monitoring.sentry.dsn diff --git a/docs/api-reference/snowbridge/configuration/telemetry/index.md b/docs/api-reference/snowbridge/configuration/telemetry/index.md index 20eaa80c1..918f35429 100644 --- a/docs/api-reference/snowbridge/configuration/telemetry/index.md +++ b/docs/api-reference/snowbridge/configuration/telemetry/index.md @@ -6,7 +6,7 @@ sidebar_position: 500 # Telemetry Configuration -You can read about our telemetry principles [here](/docs/get-started/snowplow-community-edition/telemetry/index.md). +You can read about our telemetry principles [here](/docs/get-started/self-hosted/telemetry/index.md). ## Configuration options @@ -16,7 +16,7 @@ Enabling telemetry: # Optional. Set to true to disable telemetry. disable_telemetry = false -# Optional. See here for more information: https://docs.snowplow.io/docs/get-started/snowplow-community-edition/telemetry/#how-can-i-help +# Optional. See here for more information: https://docs.snowplow.io/docs/get-started/self-hosted/telemetry/#how-can-i-help user_provided_id = "elmer.fudd@acme.com" ``` diff --git a/docs/api-reference/snowplow-mini/control-plane-api/index.md b/docs/api-reference/snowplow-mini/control-plane-api/index.md index accbffed4..5611f90a0 100644 --- a/docs/api-reference/snowplow-mini/control-plane-api/index.md +++ b/docs/api-reference/snowplow-mini/control-plane-api/index.md @@ -64,7 +64,7 @@ where `service_name` can be one of the following: `collector`, `enrich`, `esLoad #### Configuring telemetry -See our [telemetry principles](/docs/get-started/snowplow-community-edition/telemetry/index.md) for more information on telemetry. +See our [telemetry principles](/docs/get-started/self-hosted/telemetry/index.md) for more information on telemetry. HTTP GET to get current configuration diff --git a/docs/api-reference/snowplow-mini/index.md b/docs/api-reference/snowplow-mini/index.md index 568443e3f..296704488 100644 --- a/docs/api-reference/snowplow-mini/index.md +++ b/docs/api-reference/snowplow-mini/index.md @@ -10,7 +10,7 @@ sidebar_position: 120 Snowplow Mini is similar to [Snowplow Micro](/docs/data-product-studio/data-quality/snowplow-micro/index.md), with the following differences: * Micro is more portable and can easily run on your machine or in automated tests. -* Mini has more features, mainly an OpenSearch Dashboards UI, and is better integrated with Snowplow BDP. +* Mini has more features, mainly an OpenSearch Dashboards UI, and is better integrated with Snowplow. ::: @@ -21,9 +21,9 @@ You might use Snowplow Mini when: ## Getting started -Snowplow BDP users can request a Snowplow Mini instance through the console (go to `“Environments” → “Sandboxes” → “Setup a sandbox”`). +Snowplow users can request a Snowplow Mini instance through Console (go to `“Environments” → “Sandboxes” → “Setup a sandbox”`). -For Community Edition, see the setup guides for [AWS](/docs/api-reference/snowplow-mini/setup-guide-for-aws/index.md) and [GCP](/docs/api-reference/snowplow-mini/setup-guide-for-gcp/index.md). +For Snowplow Self-Hosted, see the setup guides for [AWS](/docs/api-reference/snowplow-mini/setup-guide-for-aws/index.md) and [GCP](/docs/api-reference/snowplow-mini/setup-guide-for-gcp/index.md). 
## Conceptual diagram diff --git a/docs/api-reference/snowplow-mini/previous-releases/snowplow-mini-0-14/control-plane-api/index.md b/docs/api-reference/snowplow-mini/previous-releases/snowplow-mini-0-14/control-plane-api/index.md index 2daa2706b..f61cec2b8 100644 --- a/docs/api-reference/snowplow-mini/previous-releases/snowplow-mini-0-14/control-plane-api/index.md +++ b/docs/api-reference/snowplow-mini/previous-releases/snowplow-mini-0-14/control-plane-api/index.md @@ -64,7 +64,7 @@ where `service_name` can be one of the following: `collector`, `enrich`, `esLoad #### Configuring telemetry -See our [telemetry principles](/docs/get-started/snowplow-community-edition/telemetry/index.md) for more information on telemetry. +See our [telemetry principles](/docs/get-started/self-hosted/telemetry/index.md) for more information on telemetry. HTTP GET to get current configuration diff --git a/docs/api-reference/snowplow-mini/previous-releases/snowplow-mini-0-14/usage-guide/index.md b/docs/api-reference/snowplow-mini/previous-releases/snowplow-mini-0-14/usage-guide/index.md index 1b9acf459..06327a84d 100644 --- a/docs/api-reference/snowplow-mini/previous-releases/snowplow-mini-0-14/usage-guide/index.md +++ b/docs/api-reference/snowplow-mini/previous-releases/snowplow-mini-0-14/usage-guide/index.md @@ -197,7 +197,7 @@ where `service_name` can be one of the following: `collector`, `enrich`, `esLoad ## Configuring telemetry -See our [telemetry principles](/docs/get-started/snowplow-community-edition/telemetry/index.md) for more information on telemetry. +See our [telemetry principles](/docs/get-started/self-hosted/telemetry/index.md) for more information on telemetry. HTTP GET to get current configuration diff --git a/docs/api-reference/snowplow-mini/usage-guide/index.md b/docs/api-reference/snowplow-mini/usage-guide/index.md index cabd244c3..6767dffc3 100644 --- a/docs/api-reference/snowplow-mini/usage-guide/index.md +++ b/docs/api-reference/snowplow-mini/usage-guide/index.md @@ -198,7 +198,7 @@ where `service_name` can be one of the following: `collector`, `enrich`, `esLoad ## Configuring telemetry -See our [telemetry principles](/docs/get-started/snowplow-community-edition/telemetry/index.md) for more information on telemetry. +See our [telemetry principles](/docs/get-started/self-hosted/telemetry/index.md) for more information on telemetry. HTTP GET to get current configuration diff --git a/docs/api-reference/stream-collector/configure/index.md b/docs/api-reference/stream-collector/configure/index.md index 46aa9a17e..545c4136f 100644 --- a/docs/api-reference/stream-collector/configure/index.md +++ b/docs/api-reference/stream-collector/configure/index.md @@ -62,8 +62,8 @@ collector { | `collector.preTerminationPeriod` (since *2.5.0*) | Optional. Default: `10 seconds`. Configures how long the collector should pause after receiving a sigterm before starting the graceful shutdown. During this period the collector continues to accept new connections and respond to requests. | | `collector.prometheusMetrics.enabled` (deprecated since *2.6.0*) | Optional. Default: `false`. When enabled, all requests are logged as prometheus metrics and the `/metrics` endpoint returns the report about the metrics. | | `collector.prometheusMetrics.durationBucketsInSeconds` (deprecated since *2.6.0*) | Optional. E.g. `[0.1, 3, 10]`. Custom buckets for the `http_request_duration_seconds_bucket` duration prometheus metric. | -| `collector.telemetry.disable` | Optional. 
Set to `true` to disable [telemetry](/docs/get-started/snowplow-community-edition/telemetry/index.md). | -| `collector.telemetry.userProvidedId` | Optional. See [here](/docs/get-started/snowplow-community-edition/telemetry/index.md#how-can-i-help) for more information. | +| `collector.telemetry.disable` | Optional. Set to `true` to disable [telemetry](/docs/get-started/self-hosted/telemetry/index.md). | +| `collector.telemetry.userProvidedId` | Optional. See [here](/docs/get-started/self-hosted/telemetry/index.md#how-can-i-help) for more information. | | `collector.compression.enabled` (since *3.6.0*) | Optional. Default: `false`. Enable compression on the output. Compression should only be enabled with Enrich >=6.1.0. | | `collector.compression.type` (since *3.6.0*) | Optional. Default: `zstd`. Compression algorithm to use. | | `collector.compression.gzipCompressionLevel` (since *3.6.0*) | Optional. Default: `6`. The compression level for GZIP compression. It is between 1 and 9. Lower levels have faster compression speed, but worse compression ratio. | diff --git a/docs/api-reference/versions/index.md b/docs/api-reference/versions/index.md index a95c4c40a..eecf0d822 100644 --- a/docs/api-reference/versions/index.md +++ b/docs/api-reference/versions/index.md @@ -20,9 +20,9 @@ You might encounter specific restrictions when following the documentation, for ## Upgrades and deprecation -:::info Snowplow BDP +:::info Snowplow CDI -If you are using Snowplow BDP, you don’t need to deal with upgrading your pipeline, as we perform upgrades for you. +If you are a Snowplow CDI customer, rather than self-hosted, you don't need to deal with upgrading your pipeline. We'll perform upgrades for you. ::: @@ -36,9 +36,9 @@ From time to time, we develop better applications for certain tasks and deprecat ### Core pipeline -:::info Snowplow BDP +:::info Snowplow CDI -If you are using Snowplow BDP, you don’t need to install any of the core pipeline components yourself. We deploy your pipeline and keep it up to date. +If you are a Snowplow CDI customer, rather than self-hosted, you don't need to install any of the core pipeline components yourself. We'll deploy your pipeline and keep it up to date. ::: @@ -96,9 +96,9 @@ If you are using Snowplow BDP, you don’t need to install any of the core pipel ### Iglu (schema registry) -:::info Snowplow BDP +:::info Snowplow CDI -If you are using Snowplow BDP, you don’t need to install Iglu Server yourself. It’s also unlikely that you need to use any of the other components in this section. You can manage your data structures [in the UI or via the API](/docs/data-product-studio/data-structures/manage/index.md). +If you are a Snowplow CDI customer, rather than self-hosted, you don't need to install Iglu Server yourself. It's also unlikely that you need to use any of the other components in this section. You can manage your data structures [in the UI or via the API](/docs/data-product-studio/data-structures/manage/index.md). ::: @@ -173,9 +173,9 @@ import ModelVersionsSqlRunner from '@site/docs/modeling-your-data/modeling-your- ### Testing and debugging -:::info Snowplow BDP +:::info Snowplow CDI -If you are using Snowplow BDP, you don’t need to install Snowplow Mini yourself. We (optionally) deploy it and keep it up to date for you. +If you are a Snowplow CDI customer, rather than self-hosted, you don't need to install Snowplow Mini yourself. We can deploy it as required, and keep it up to date for you. 
::: diff --git a/docs/data-product-studio/data-products/api/index.md b/docs/data-product-studio/data-products/api/index.md index aadcffa88..ba9fbbd8c 100644 --- a/docs/data-product-studio/data-products/api/index.md +++ b/docs/data-product-studio/data-products/api/index.md @@ -2,12 +2,9 @@ title: "Managing Data Products via the API" sidebar_label: "Using the API" sidebar_position: 3 -sidebar_custom_props: - offerings: - - bdp --- -As well as managing [data products](/docs/data-product-studio/data-products/index.md) through the Snowplow BDP Console, Snowplow BDP customers can also manage them programmatically through an API. +As well as managing [data products](/docs/data-product-studio/data-products/index.md) through Snowplow Console, Snowplow customers can also manage them programmatically through an API. This functionality is key to automating existing processes and frequent manual tasks, including workflows in version control systems like GitHub. diff --git a/docs/data-product-studio/data-products/cli/index.md b/docs/data-product-studio/data-products/cli/index.md index 15d0ba91e..24894dbb6 100644 --- a/docs/data-product-studio/data-products/cli/index.md +++ b/docs/data-product-studio/data-products/cli/index.md @@ -79,7 +79,7 @@ In this example event specification `All source apps` is related to both `generi ```bash ./snowplow-cli dp download ``` -This command retrieves all organization data products, event specifications, and source applications. By default, it creates a folder named `data-products` in your current working directory. You can specify a different folder name as an argument if needed. +This command retrieves all organization data products, event specifications, and source applications. By default, it creates a folder named `data-products` in your current working directory. You can specify a different folder name as an argument if needed. The command creates the following structure: - A main `data-products` folder containing your data product files - A `source-apps` subfolder containing source application definitions @@ -88,7 +88,7 @@ The command creates the following structure: ```bash ./snowplow-cli dp validate ``` -This command scans all files under `./data-products` and validates them using the BDP console. It checks: +This command scans all files under `./data-products` and validates them using Snowplow Console. It checks: 1. Whether each file is in a valid format (YAML/JSON) with correctly formatted fields 2. Whether all source application references in the data product files are valid 3. Whether event specification rules are compatible with their schemas @@ -97,4 +97,4 @@ If validation fails, the command displays the errors in the console and exits wi ```bash ./snowplow-cli dp publish ``` -This command locates all files under `./data-products`, validates them, and publishes them to the BDP console. +This command locates all files under `./data-products`, validates them, and publishes them to Console. 
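Putting these commands together, a typical end-to-end workflow (using the default `data-products` folder) might look like this sketch:

```bash
# A typical end-to-end workflow using the default ./data-products folder
./snowplow-cli dp download    # pull data products, event specifications, and source apps
# ... edit the downloaded YAML/JSON files locally ...
./snowplow-cli dp validate    # check file formats, source app references, and rule compatibility
./snowplow-cli dp publish     # validate again and publish the changes to Snowplow Console
```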
diff --git a/docs/data-product-studio/data-products/data-product-templates/index.md b/docs/data-product-studio/data-products/data-product-templates/index.md index 62cbe3021..c93800b83 100644 --- a/docs/data-product-studio/data-products/data-product-templates/index.md +++ b/docs/data-product-studio/data-products/data-product-templates/index.md @@ -3,9 +3,6 @@ title: "Data product templates" date: "2024-06-17" sidebar_label: "Using data product templates" sidebar_position: 2 -sidebar_custom_props: - offerings: - - bdp --- ## Creating a Data Product based on Templates using Console diff --git a/docs/data-product-studio/data-products/ui/index.md b/docs/data-product-studio/data-products/ui/index.md index 90791dd08..b7bc2ac94 100644 --- a/docs/data-product-studio/data-products/ui/index.md +++ b/docs/data-product-studio/data-products/ui/index.md @@ -3,9 +3,6 @@ title: "Managing Data Product using Console" date: "2024-01-18" sidebar_label: "Using the UI" sidebar_position: 1 -sidebar_custom_props: - offerings: - - bdp --- ## Creating and editing a new Data Product using Console @@ -44,7 +41,7 @@ In the image below, you can see an example of a data product. It not only provid - **Subscribe**; receive notifications of any changes in the data product - **Implement tracking**; automatically generate the code for your data product to be included in your application (to learn more visit [Code Generation - automatically generate code for Snowplow tracking SDKs](/docs/data-product-studio/snowtype/index.md)) -*Notes: sharing and subscribing is only available for users registered in Snowplow BDP Console.* +*Notes: sharing and subscribing is only available for users registered in Snowplow Console.* ![Data product overview](images/data-product-overview.png) @@ -58,7 +55,7 @@ Data Products created prior to the release of [Source Applications](/docs/data-p ![Updating existing Data Products](images/edit-existing-data-product.png) -Event specifications which contain previously added application IDs will need to be updated to use the identifiers inherited from the Source Applications selected at Data Product level. This process can be done manually but you can reach out to our Support team to help you with that by either logging a request through our Snowplow [BDP Console](https://console.snowplowanalytics.com/) or by directly emailing [support@snowplow.io](mailto:support@snowplow.io). +Event specifications which contain previously added application IDs will need to be updated to use the identifiers inherited from the Source Applications selected at Data Product level. This process can be done manually but you can reach out to our Support team to help you with that by either logging a request through Snowplow [Console](https://console.snowplowanalytics.com/) or by directly emailing [support@snowplow.io](mailto:support@snowplow.io). 
![Updating existing Event Specifications](images/edit-existing-event-specification.png) diff --git a/docs/data-product-studio/data-quality/data-structures-ci-tool/index.md b/docs/data-product-studio/data-quality/data-structures-ci-tool/index.md index 04df3bdfb..409f6d695 100644 --- a/docs/data-product-studio/data-quality/data-structures-ci-tool/index.md +++ b/docs/data-product-studio/data-quality/data-structures-ci-tool/index.md @@ -2,9 +2,6 @@ title: "Using the Data Structures CI tool" date: "2020-06-01" sidebar_position: 4 -sidebar_custom_props: - offerings: - - bdp sidebar_label: "Data Structures CI tool" --- diff --git a/docs/data-product-studio/data-quality/failed-events/exploring-failed-events/file-storage/index.md b/docs/data-product-studio/data-quality/failed-events/exploring-failed-events/file-storage/index.md index 9bf3963c2..1b12b0f3c 100644 --- a/docs/data-product-studio/data-quality/failed-events/exploring-failed-events/file-storage/index.md +++ b/docs/data-product-studio/data-quality/failed-events/exploring-failed-events/file-storage/index.md @@ -11,9 +11,9 @@ import TabItem from '@theme/TabItem'; On AWS and GCP, when failed events are generated on your pipeline, the raw event payload along with details about the failure are saved into file storage (S3 on AWS, GCS on Google Cloud). -:::info Community Edition quick start guide on GCP +:::info Snowplow Self-Hosted quick start guide on GCP -If you followed the [Community Edition quick start guide](/docs/get-started/snowplow-community-edition/quick-start/index.md) on GCP, you will need to manually deploy the [GCS Loader](/docs/api-reference/loaders-storage-targets/google-cloud-storage-loader/index.md) to save failed events into GCS, as it’s currently not included in the Terraform scripts. +If you followed the [Snowplow Self-Hosted quick start guide](/docs/get-started/self-hosted/quick-start/index.md) on GCP, you will need to manually deploy the [GCS Loader](/docs/api-reference/loaders-storage-targets/google-cloud-storage-loader/index.md) to save failed events into GCS, as it’s currently not included in the Terraform scripts. ::: @@ -112,9 +112,9 @@ Note that the SQL statements contain a few placeholders which you will need to e -:::info Community Edition quick start guide on GCP +:::info Snowplow Self-Hosted quick start guide on GCP -If you followed the [Community Edition quick start guide](/docs/get-started/snowplow-community-edition/quick-start/index.md), you will need to manually deploy the [GCS Loader](/docs/api-reference/loaders-storage-targets/google-cloud-storage-loader/index.md) to save failed events into GCS, as it’s currently not included in the Terraform scripts. +If you followed the [Snowplow Self-Hosted Quick Start guide](/docs/get-started/self-hosted/quick-start/index.md), you will need to manually deploy the [GCS Loader](/docs/api-reference/loaders-storage-targets/google-cloud-storage-loader/index.md) to save failed events into GCS, as it’s currently not included in the Terraform scripts. 
::: diff --git a/docs/data-product-studio/data-quality/failed-events/exploring-failed-events/index.md b/docs/data-product-studio/data-quality/failed-events/exploring-failed-events/index.md index c32c28027..d8a08853a 100644 --- a/docs/data-product-studio/data-quality/failed-events/exploring-failed-events/index.md +++ b/docs/data-product-studio/data-quality/failed-events/exploring-failed-events/index.md @@ -89,11 +89,11 @@ Here is an example of what the `contexts_com_snowplowanalytics_snowplow_failure_ To use this feature, you will first need to enable the stream that contains failed events in the [Snowplow TSV format](/docs/fundamentals/canonical-event/understanding-the-enriched-tsv-format/index.md) suitable for loading into your warehouse or lake. -The instructions below are for Snowplow BDP users. For Community Edition, you will need to configure this manually via Terraform. +The instructions below are for Snowplow CDI customers. For Snowplow Self-Hosted, you will need to configure this manually via Terraform. :::note Infrastructure costs -An additional stream (Kinesis, Pub/Sub or Event Hubs on AWS, GCP and Azure respectively) will be reflected in your cloud infrastructure costs (unless you are using BDP Cloud). That said, failed events are usually a tiny fraction of all events, so this stream will be minimally sized. +An additional stream (Kinesis, Pub/Sub or Event Hubs on AWS, GCP and Azure respectively) will be reflected in your cloud infrastructure costs (unless you are using [Snowplow Cloud](/docs/get-started/index.md#cdi-cloud)). That said, failed events are usually a tiny fraction of all events, so this stream will be minimally sized. ::: diff --git a/docs/data-product-studio/data-quality/failed-events/monitoring-failed-events/alerts/classic-alerts/index.md b/docs/data-product-studio/data-quality/failed-events/monitoring-failed-events/alerts/classic-alerts/index.md index b1776ebcc..010c5a601 100644 --- a/docs/data-product-studio/data-quality/failed-events/monitoring-failed-events/alerts/classic-alerts/index.md +++ b/docs/data-product-studio/data-quality/failed-events/monitoring-failed-events/alerts/classic-alerts/index.md @@ -12,11 +12,11 @@ Snowplow can send two types of pipeline alerts to help you monitor failed events - **Failed event digest**: receive a daily digest of all failed event activity in the previous 48-hour period. -To receive alerts you must have the failed events [monitoring](/docs/data-product-studio/data-quality/failed-events/monitoring-failed-events/index.md) feature switched on in Snowplow BDP Console. +To receive alerts you must have the failed events [monitoring](/docs/data-product-studio/data-quality/failed-events/monitoring-failed-events/index.md) feature switched on in Snowplow Console. 
## Subscribing to alerts -- Login to Snowplow BDP console +- Login to Snowplow Console - Locate the pipeline you wish to set up alerts for in the left-hand navigation - Click on the `Configuration` tab, then the `Pipeline alerts` section diff --git a/docs/data-product-studio/data-quality/failed-events/monitoring-failed-events/alerts/failed-event-alerts/creating-alerts/index.md b/docs/data-product-studio/data-quality/failed-events/monitoring-failed-events/alerts/failed-event-alerts/creating-alerts/index.md index fd0c74cad..9d1f58785 100644 --- a/docs/data-product-studio/data-quality/failed-events/monitoring-failed-events/alerts/failed-event-alerts/creating-alerts/index.md +++ b/docs/data-product-studio/data-quality/failed-events/monitoring-failed-events/alerts/failed-event-alerts/creating-alerts/index.md @@ -8,7 +8,7 @@ Set up failed event alerts to receive notifications when failed events occur in You'll need access to the [data quality dashboard](/docs/data-product-studio/data-quality/failed-events/monitoring-failed-events/index.md#data-quality-dashboard). -To create an alert, go to the BDP Console: +To create an alert, go to Snowplow Console: 1. Navigate to **Data Quality** in the left sidebar 2. Click **Manage alerts** in the top-right corner 3. Click **Create alert** diff --git a/docs/data-product-studio/data-quality/failed-events/monitoring-failed-events/index.md b/docs/data-product-studio/data-quality/failed-events/monitoring-failed-events/index.md index eb776cad6..97f749e19 100644 --- a/docs/data-product-studio/data-quality/failed-events/monitoring-failed-events/index.md +++ b/docs/data-product-studio/data-quality/failed-events/monitoring-failed-events/index.md @@ -1,17 +1,14 @@ --- title: "Monitoring failed events" sidebar_position: 2 -sidebar_custom_props: - offerings: - - bdp sidebar_label: "Monitor" --- Snowplow pipelines separate events that are problematic in order to keep data quality high in downstream systems. For more information on understanding failed events see [here](/docs/fundamentals/failed-events/index.md). -## Monitoring failed events in Snowplow BDP Console +## Monitoring failed events in Snowplow Console -For Snowplow customers that would like to benefit from seeing aggregates of failed events by type, there are relevant optional features in the Snowplow BDP Console. Snowplow offers two different ways to monitor failed events: +For Snowplow customers that would like to benefit from seeing aggregates of failed events by type, there are relevant optional features in Snowplow Console. Snowplow offers two different ways to monitor failed events: - The default view with little available information to debug errors - The data quality dashboard that surfaces failed events directly from your warehouse in a secure manner, making debugging easier @@ -25,7 +22,7 @@ The default view is a relatively simple interface that shows the number of faile In this setup, you expose no additional interface to the public internet, and all failed events information is served by the Console's APIs. 
-Below is an example view of the failed events screen in the Snowplow BDP Console: +Below is an example view of the failed events screen in Snowplow Console: ![](images/image-1024x1024.png) diff --git a/docs/data-product-studio/data-quality/failed-events/monitoring-failed-events/troubleshooting/index.md b/docs/data-product-studio/data-quality/failed-events/monitoring-failed-events/troubleshooting/index.md index 9233f279f..3e07a4c45 100644 --- a/docs/data-product-studio/data-quality/failed-events/monitoring-failed-events/troubleshooting/index.md +++ b/docs/data-product-studio/data-quality/failed-events/monitoring-failed-events/troubleshooting/index.md @@ -1,9 +1,6 @@ --- title: "Troubleshooting data quality dashboard" sidebar_position: 4 -sidebar_custom_props: - offerings: - - bdp sidebar_label: "Troubleshooting" --- @@ -109,7 +106,7 @@ Long-running queries or resource pool exhaustion can cause the data quality dash #### Error code range - `12xxx` - `22xxx` - + #### Error description `Query exceeded timeout` or `Query execution time limit exceeded` diff --git a/docs/data-product-studio/data-quality/failed-events/recovering-failed-events/builder/index.md b/docs/data-product-studio/data-quality/failed-events/recovering-failed-events/builder/index.md index 170f1ee50..eb6d3506f 100644 --- a/docs/data-product-studio/data-quality/failed-events/recovering-failed-events/builder/index.md +++ b/docs/data-product-studio/data-quality/failed-events/recovering-failed-events/builder/index.md @@ -2,12 +2,9 @@ title: "Using the Console to request data recovery" sidebar_label: "Using the Console" sidebar_position: 0 -sidebar_custom_props: - offerings: - - bdp --- -You can use BDP Console to [submit a data recovery request](https://console.snowplowanalytics.com/recovery) to our Support team. +You can use Snowplow Console to [submit a data recovery request](https://console.snowplowanalytics.com/recovery) to our Support team. :::caution diff --git a/docs/data-product-studio/data-quality/failed-events/recovering-failed-events/manual/getting-started/index.md b/docs/data-product-studio/data-quality/failed-events/recovering-failed-events/manual/getting-started/index.md index a7f68ee72..3cd1188fb 100644 --- a/docs/data-product-studio/data-quality/failed-events/recovering-failed-events/manual/getting-started/index.md +++ b/docs/data-product-studio/data-quality/failed-events/recovering-failed-events/manual/getting-started/index.md @@ -6,7 +6,7 @@ sidebar_position: 0 Event recovery at its core, is the ability to fix events that have failed and replay them through your pipeline. -After inspecting failed events either in the [Snowplow BDP Console](/docs/data-product-studio/data-quality/failed-events/monitoring-failed-events/index.md), or in the [partitioned failure buckets](/docs/data-product-studio/data-quality/failed-events/exploring-failed-events/file-storage/index.md), you can determine which events are possible to recover based on what the fix entails. +After inspecting failed events either in [Snowplow Console](/docs/data-product-studio/data-quality/failed-events/monitoring-failed-events/index.md), or in the [partitioned failure buckets](/docs/data-product-studio/data-quality/failed-events/exploring-failed-events/file-storage/index.md), you can determine which events are possible to recover based on what the fix entails. 
With recovery it is possible to: diff --git a/docs/data-product-studio/data-quality/failed-events/recovering-failed-events/manual/hints-workflows/index.md b/docs/data-product-studio/data-quality/failed-events/recovering-failed-events/manual/hints-workflows/index.md index ce04c1333..162f4246b 100644 --- a/docs/data-product-studio/data-quality/failed-events/recovering-failed-events/manual/hints-workflows/index.md +++ b/docs/data-product-studio/data-quality/failed-events/recovering-failed-events/manual/hints-workflows/index.md @@ -26,7 +26,7 @@ To explore the data structures we are able to use the convenience scripts as des Given we got a bad row JSON structure we can turn it into navigable JSON object that we can step through and encode/decode using for example: ```scala -val badrow = """{"schema":"iglu:com.snowplowanalytics.snowplow.badrows/enrichment_failures/jsonschema/1-0-0","data":{"processor":{"artifact":"beam-enrich","version":"1.0.0-rc5"},"failure":{"timestamp":"2020-02-17T09:28:18.100Z","messages":[{"enrichment":{"schemaKey":"iglu:com.snowplowanalytics.snowplow.enrichments/api_request_enrichment_config/jsonschema/1-0-0","identifier":"api-request"},"message":{"error":"Error accessing POJO input field [user]: [java.lang.NoSuchMethodException: com.snowplowanalytics.snowplow.enrich.common.outputs.EnrichedEvent.user_id-foo()]"}}]},"payload":{"enriched":{"app_id":"console","platform":"web","etl_tstamp":"2020-02-17 09:28:18.095","collector_tstamp":"2020-02-17 09:28:16.560","dvce_created_tstamp":"2020-02-17 09:28:16.114","event":"page_view","event_id":"2dfeb9b7-5a87-4214-8a97-a8b23176856b","txn_id":null,"name_tracker":"msc-gcp-stg1","v_tracker":"js-2.10.2","v_collector":"ssc-1.0.0-rc4-googlepubsub","v_etl":"beam-enrich-1.0.0-rc5-common-1.0.0","user_id":null,"user_ipaddress":"18.194.133.57","user_fingerprint":null,"domain_userid":"d6c468de-0aed-4785-9052-b6bb77b6dddb","domain_sessionidx":13,"network_userid":"510b2f05-27e3-4fd3-b449-a2702926da5e","geo_country":"DE","geo_region":"HE","geo_city":"Frankfurt am Main","geo_zipcode":"60313","geo_latitude":50.1188,"geo_longitude":8.6843,"geo_region_name":"Hesse","ip_isp":null,"ip_organization":null,"ip_domain":null,"ip_netspeed":null,"page_url":"https://console.snowplowanalytics.com/","page_title":"Snowplow 
BDP","page_referrer":null,"page_urlscheme":"https","page_urlhost":"console.snowplowanalytics.com","page_urlport":443,"page_urlpath":"/","page_urlquery":null,"page_urlfragment":null,"refr_urlscheme":null,"refr_urlhost":null,"refr_urlport":0,"refr_urlpath":null,"refr_urlquery":null,"refr_urlfragment":null,"refr_medium":null,"refr_source":null,"refr_term":null,"mkt_medium":null,"mkt_source":null,"mkt_term":null,"mkt_content":null,"mkt_campaign":null,"contexts":"{\"schema\":\"iglu:com.snowplowanalytics.snowplow/contexts/jsonschema/1-0-0\",\"data\":[{\"schema\":\"iglu:com.snowplowanalytics.snowplow/web_page/jsonschema/1-0-0\",\"data\":{\"id\":\"39a9934a-ddd3-4581-a4ea-d0ba20e63b92\"}},{\"schema\":\"iglu:org.w3/PerformanceTiming/jsonschema/1-0-0\",\"data\":{\"navigationStart\":1581931694397,\"unloadEventStart\":1581931696046,\"unloadEventEnd\":1581931694764,\"redirectStart\":0,\"redirectEnd\":0,\"fetchStart\":1581931694397,\"domainLookupStart\":1581931694440,\"domainLookupEnd\":1581931694513,\"connectStart\":1581931694513,\"connectEnd\":1581931694665,\"secureConnectionStart\":1581931694572,\"requestStart\":1581931694665,\"responseStart\":1581931694750,\"responseEnd\":1581931694750,\"domLoading\":1581931694762,\"domInteractive\":1581931695963,\"domContentLoadedEventStart\":1581931696039,\"domContentLoadedEventEnd\":1581931696039,\"domComplete\":0,\"loadEventStart\":0,\"loadEventEnd\":0}}]}","se_category":null,"se_action":null,"se_label":null,"se_property":null,"se_value":null,"unstruct_event":null,"tr_orderid":null,"tr_affiliation":null,"tr_total":null,"tr_tax":null,"tr_shipping":null,"tr_city":null,"tr_state":null,"tr_country":null,"ti_orderid":null,"ti_sku":null,"ti_name":null,"ti_category":null,"ti_price":null,"ti_quantity":0,"pp_xoffset_min":0,"pp_xoffset_max":0,"pp_yoffset_min":0,"pp_yoffset_max":0,"useragent":"Mozilla/5.0 (X11; Linux x86_64; rv:72.0) Gecko/20100101 Firefox/72.0","br_name":null,"br_family":null,"br_version":null,"br_type":null,"br_renderengine":null,"br_lang":"en-US","br_features_pdf":0,"br_features_flash":0,"br_features_java":0,"br_features_director":0,"br_features_quicktime":0,"br_features_realplayer":0,"br_features_windowsmedia":0,"br_features_gears":0,"br_features_silverlight":0,"br_cookies":1,"br_colordepth":"24","br_viewwidth":1918,"br_viewheight":982,"os_name":null,"os_family":null,"os_manufacturer":null,"os_timezone":"Europe/Berlin","dvce_type":null,"dvce_ismobile":0,"dvce_screenwidth":1920,"dvce_screenheight":1080,"doc_charset":"UTF-8","doc_width":1918,"doc_height":982,"tr_currency":null,"tr_total_base":null,"tr_tax_base":null,"tr_shipping_base":null,"ti_currency":null,"ti_price_base":null,"base_currency":null,"geo_timezone":"Europe/Berlin","mkt_clickid":null,"mkt_network":null,"etl_tags":null,"dvce_sent_tstamp":"2020-02-17 09:28:16.507","refr_domain_userid":null,"refr_dvce_tstamp":null,"derived_contexts":"{\"schema\":\"iglu:com.snowplowanalytics.snowplow/contexts/jsonschema/1-0-1\",\"data\":[{\"schema\":\"iglu:com.snowplowanalytics.snowplow/ua_parser_context/jsonschema/1-0-0\",\"data\":{\"useragentFamily\":\"Firefox\",\"useragentMajor\":\"72\",\"useragentMinor\":\"0\",\"useragentPatch\":null,\"useragentVersion\":\"Firefox 72.0\",\"osFamily\":\"Linux\",\"osMajor\":null,\"osMinor\":null,\"osPatch\":null,\"osPatchMinor\":null,\"osVersion\":\"Linux\",\"deviceFamily\":\"Other\"}}]}","domain_sessionid":"96958bf6-a8bf-4be8-9c67-fd957b6bc8d2","derived_tstamp":"2020-02-17 
09:28:16.167","event_vendor":"com.snowplowanalytics.snowplow","event_name":"page_view","event_format":"jsonschema","event_version":"1-0-0","event_fingerprint":"5acdc8f85f9530081d1a71ec430c8756","true_tstamp":null},"raw":{"vendor":"com.snowplowanalytics.snowplow","version":"tp2","parameters":[{"name":"e","value":"pv"},{"name":"duid","value":"d6c468de-0aed-4785-9052-b6bb77b6dddb"},{"name":"vid","value":"13"},{"name":"eid","value":"2dfeb9b7-5a87-4214-8a97-a8b23176856b"},{"name":"url","value":"https://console.snowplowanalytics.com/"},{"name":"aid","value":"console"},{"name":"cx","value":"eyJzY2hlbWEiOiJpZ2x1OmNvbS5zbm93cGxvd2FuYWx5dGljcy5zbm93cGxvdy9jb250ZXh0cy9qc29uc2NoZW1hLzEtMC0wIiwiZGF0YSI6W3sic2NoZW1hIjoiaWdsdTpjb20uc25vd3Bsb3dhbmFseXRpY3Muc25vd3Bsb3cvd2ViX3BhZ2UvanNvbnNjaGVtYS8xLTAtMCIsImRhdGEiOnsiaWQiOiIzOWE5OTM0YS1kZGQzLTQ1ODEtYTRlYS1kMGJhMjBlNjNiOTIifX0seyJzY2hlbWEiOiJpZ2x1Om9yZy53My9QZXJmb3JtYW5jZVRpbWluZy9qc29uc2NoZW1hLzEtMC0wIiwiZGF0YSI6eyJuYXZpZ2F0aW9uU3RhcnQiOjE1ODE5MzE2OTQzOTcsInVubG9hZEV2ZW50U3RhcnQiOjE1ODE5MzE2OTYwNDYsInVubG9hZEV2ZW50RW5kIjoxNTgxOTMxNjk0NzY0LCJyZWRpcmVjdFN0YXJ0IjowLCJyZWRpcmVjdEVuZCI6MCwiZmV0Y2hTdGFydCI6MTU4MTkzMTY5NDM5NywiZG9tYWluTG9va3VwU3RhcnQiOjE1ODE5MzE2OTQ0NDAsImRvbWFpbkxvb2t1cEVuZCI6MTU4MTkzMTY5NDUxMywiY29ubmVjdFN0YXJ0IjoxNTgxOTMxNjk0NTEzLCJjb25uZWN0RW5kIjoxNTgxOTMxNjk0NjY1LCJzZWN1cmVDb25uZWN0aW9uU3RhcnQiOjE1ODE5MzE2OTQ1NzIsInJlcXVlc3RTdGFydCI6MTU4MTkzMTY5NDY2NSwicmVzcG9uc2VTdGFydCI6MTU4MTkzMTY5NDc1MCwicmVzcG9uc2VFbmQiOjE1ODE5MzE2OTQ3NTAsImRvbUxvYWRpbmciOjE1ODE5MzE2OTQ3NjIsImRvbUludGVyYWN0aXZlIjoxNTgxOTMxNjk1OTYzLCJkb21Db250ZW50TG9hZGVkRXZlbnRTdGFydCI6MTU4MTkzMTY5NjAzOSwiZG9tQ29udGVudExvYWRlZEV2ZW50RW5kIjoxNTgxOTMxNjk2MDM5LCJkb21Db21wbGV0ZSI6MCwibG9hZEV2ZW50U3RhcnQiOjAsImxvYWRFdmVudEVuZCI6MH19XX0"},{"name":"tna","value":"msc-gcp-stg1"},{"name":"cs","value":"UTF-8"},{"name":"cd","value":"24"},{"name":"page","value":"Snowplow BDP"},{"name":"stm","value":"1581931696507"},{"name":"tz","value":"Europe/Berlin"},{"name":"tv","value":"js-2.10.2"},{"name":"vp","value":"1918x982"},{"name":"ds","value":"1918x982"},{"name":"res","value":"1920x1080"},{"name":"cookie","value":"1"},{"name":"p","value":"web"},{"name":"dtm","value":"1581931696114"},{"name":"lang","value":"en-US"},{"name":"sid","value":"96958bf6-a8bf-4be8-9c67-fd957b6bc8d2"}],"contentType":"application/json","loaderName":"ssc-1.0.0-rc4-googlepubsub","encoding":"UTF-8","hostname":"gcp-sandbox-prod1.collector.snplow.net","timestamp":"2020-02-17T09:28:16.560Z","ipAddress":"18.194.133.57","useragent":"Mozilla/5.0 (X11; Linux x86_64; rv:72.0) Gecko/20100101 Firefox/72.0","refererUri":"https://console.snowplowanalytics.com/","headers":["Timeout-Access: ","Host: gcp-sandbox-prod1.collector.snplow.net","User-Agent: Mozilla/5.0 (X11; Linux x86_64; rv:72.0) Gecko/20100101 Firefox/72.0","Accept: */*","Accept-Language: en-US, en;q=0.5","Accept-Encoding: gzip, deflate, br","Origin: https://console.snowplowanalytics.com","Referer: https://console.snowplowanalytics.com/","Cookie: sp=510b2f05-27e3-4fd3-b449-a2702926da5e","X-Cloud-Trace-Context: 958285ba723e212998af29cec405e002/12535615945289151925","Via: 1.1 google","X-Forwarded-For: 18.194.133.57, 35.201.76.62","X-Forwarded-Proto: https","Connection: Keep-Alive","application/json"],"userId":"510b2f05-27e3-4fd3-b449-a2702926da5e"}}}}""" +val badrow = 
"""{"schema":"iglu:com.snowplowanalytics.snowplow.badrows/enrichment_failures/jsonschema/1-0-0","data":{"processor":{"artifact":"beam-enrich","version":"1.0.0-rc5"},"failure":{"timestamp":"2020-02-17T09:28:18.100Z","messages":[{"enrichment":{"schemaKey":"iglu:com.snowplowanalytics.snowplow.enrichments/api_request_enrichment_config/jsonschema/1-0-0","identifier":"api-request"},"message":{"error":"Error accessing POJO input field [user]: [java.lang.NoSuchMethodException: com.snowplowanalytics.snowplow.enrich.common.outputs.EnrichedEvent.user_id-foo()]"}}]},"payload":{"enriched":{"app_id":"console","platform":"web","etl_tstamp":"2020-02-17 09:28:18.095","collector_tstamp":"2020-02-17 09:28:16.560","dvce_created_tstamp":"2020-02-17 09:28:16.114","event":"page_view","event_id":"2dfeb9b7-5a87-4214-8a97-a8b23176856b","txn_id":null,"name_tracker":"msc-gcp-stg1","v_tracker":"js-2.10.2","v_collector":"ssc-1.0.0-rc4-googlepubsub","v_etl":"beam-enrich-1.0.0-rc5-common-1.0.0","user_id":null,"user_ipaddress":"18.194.133.57","user_fingerprint":null,"domain_userid":"d6c468de-0aed-4785-9052-b6bb77b6dddb","domain_sessionidx":13,"network_userid":"510b2f05-27e3-4fd3-b449-a2702926da5e","geo_country":"DE","geo_region":"HE","geo_city":"Frankfurt am Main","geo_zipcode":"60313","geo_latitude":50.1188,"geo_longitude":8.6843,"geo_region_name":"Hesse","ip_isp":null,"ip_organization":null,"ip_domain":null,"ip_netspeed":null,"page_url":"https://console.snowplowanalytics.com/","page_title":"Snowplow CDI","page_referrer":null,"page_urlscheme":"https","page_urlhost":"console.snowplowanalytics.com","page_urlport":443,"page_urlpath":"/","page_urlquery":null,"page_urlfragment":null,"refr_urlscheme":null,"refr_urlhost":null,"refr_urlport":0,"refr_urlpath":null,"refr_urlquery":null,"refr_urlfragment":null,"refr_medium":null,"refr_source":null,"refr_term":null,"mkt_medium":null,"mkt_source":null,"mkt_term":null,"mkt_content":null,"mkt_campaign":null,"contexts":"{\"schema\":\"iglu:com.snowplowanalytics.snowplow/contexts/jsonschema/1-0-0\",\"data\":[{\"schema\":\"iglu:com.snowplowanalytics.snowplow/web_page/jsonschema/1-0-0\",\"data\":{\"id\":\"39a9934a-ddd3-4581-a4ea-d0ba20e63b92\"}},{\"schema\":\"iglu:org.w3/PerformanceTiming/jsonschema/1-0-0\",\"data\":{\"navigationStart\":1581931694397,\"unloadEventStart\":1581931696046,\"unloadEventEnd\":1581931694764,\"redirectStart\":0,\"redirectEnd\":0,\"fetchStart\":1581931694397,\"domainLookupStart\":1581931694440,\"domainLookupEnd\":1581931694513,\"connectStart\":1581931694513,\"connectEnd\":1581931694665,\"secureConnectionStart\":1581931694572,\"requestStart\":1581931694665,\"responseStart\":1581931694750,\"responseEnd\":1581931694750,\"domLoading\":1581931694762,\"domInteractive\":1581931695963,\"domContentLoadedEventStart\":1581931696039,\"domContentLoadedEventEnd\":1581931696039,\"domComplete\":0,\"loadEventStart\":0,\"loadEventEnd\":0}}]}","se_category":null,"se_action":null,"se_label":null,"se_property":null,"se_value":null,"unstruct_event":null,"tr_orderid":null,"tr_affiliation":null,"tr_total":null,"tr_tax":null,"tr_shipping":null,"tr_city":null,"tr_state":null,"tr_country":null,"ti_orderid":null,"ti_sku":null,"ti_name":null,"ti_category":null,"ti_price":null,"ti_quantity":0,"pp_xoffset_min":0,"pp_xoffset_max":0,"pp_yoffset_min":0,"pp_yoffset_max":0,"useragent":"Mozilla/5.0 (X11; Linux x86_64; rv:72.0) Gecko/20100101 
Firefox/72.0","br_name":null,"br_family":null,"br_version":null,"br_type":null,"br_renderengine":null,"br_lang":"en-US","br_features_pdf":0,"br_features_flash":0,"br_features_java":0,"br_features_director":0,"br_features_quicktime":0,"br_features_realplayer":0,"br_features_windowsmedia":0,"br_features_gears":0,"br_features_silverlight":0,"br_cookies":1,"br_colordepth":"24","br_viewwidth":1918,"br_viewheight":982,"os_name":null,"os_family":null,"os_manufacturer":null,"os_timezone":"Europe/Berlin","dvce_type":null,"dvce_ismobile":0,"dvce_screenwidth":1920,"dvce_screenheight":1080,"doc_charset":"UTF-8","doc_width":1918,"doc_height":982,"tr_currency":null,"tr_total_base":null,"tr_tax_base":null,"tr_shipping_base":null,"ti_currency":null,"ti_price_base":null,"base_currency":null,"geo_timezone":"Europe/Berlin","mkt_clickid":null,"mkt_network":null,"etl_tags":null,"dvce_sent_tstamp":"2020-02-17 09:28:16.507","refr_domain_userid":null,"refr_dvce_tstamp":null,"derived_contexts":"{\"schema\":\"iglu:com.snowplowanalytics.snowplow/contexts/jsonschema/1-0-1\",\"data\":[{\"schema\":\"iglu:com.snowplowanalytics.snowplow/ua_parser_context/jsonschema/1-0-0\",\"data\":{\"useragentFamily\":\"Firefox\",\"useragentMajor\":\"72\",\"useragentMinor\":\"0\",\"useragentPatch\":null,\"useragentVersion\":\"Firefox 72.0\",\"osFamily\":\"Linux\",\"osMajor\":null,\"osMinor\":null,\"osPatch\":null,\"osPatchMinor\":null,\"osVersion\":\"Linux\",\"deviceFamily\":\"Other\"}}]}","domain_sessionid":"96958bf6-a8bf-4be8-9c67-fd957b6bc8d2","derived_tstamp":"2020-02-17 09:28:16.167","event_vendor":"com.snowplowanalytics.snowplow","event_name":"page_view","event_format":"jsonschema","event_version":"1-0-0","event_fingerprint":"5acdc8f85f9530081d1a71ec430c8756","true_tstamp":null},"raw":{"vendor":"com.snowplowanalytics.snowplow","version":"tp2","parameters":[{"name":"e","value":"pv"},{"name":"duid","value":"d6c468de-0aed-4785-9052-b6bb77b6dddb"},{"name":"vid","value":"13"},{"name":"eid","value":"2dfeb9b7-5a87-4214-8a97-a8b23176856b"},{"name":"url","value":"https://console.snowplowanalytics.com/"},{"name":"aid","value":"console"},{"name":"cx","value":"eyJzY2hlbWEiOiJpZ2x1OmNvbS5zbm93cGxvd2FuYWx5dGljcy5zbm93cGxvdy9jb250ZXh0cy9qc29uc2NoZW1hLzEtMC0wIiwiZGF0YSI6W3sic2NoZW1hIjoiaWdsdTpjb20uc25vd3Bsb3dhbmFseXRpY3Muc25vd3Bsb3cvd2ViX3BhZ2UvanNvbnNjaGVtYS8xLTAtMCIsImRhdGEiOnsiaWQiOiIzOWE5OTM0YS1kZGQzLTQ1ODEtYTRlYS1kMGJhMjBlNjNiOTIifX0seyJzY2hlbWEiOiJpZ2x1Om9yZy53My9QZXJmb3JtYW5jZVRpbWluZy9qc29uc2NoZW1hLzEtMC0wIiwiZGF0YSI6eyJuYXZpZ2F0aW9uU3RhcnQiOjE1ODE5MzE2OTQzOTcsInVubG9hZEV2ZW50U3RhcnQiOjE1ODE5MzE2OTYwNDYsInVubG9hZEV2ZW50RW5kIjoxNTgxOTMxNjk0NzY0LCJyZWRpcmVjdFN0YXJ0IjowLCJyZWRpcmVjdEVuZCI6MCwiZmV0Y2hTdGFydCI6MTU4MTkzMTY5NDM5NywiZG9tYWluTG9va3VwU3RhcnQiOjE1ODE5MzE2OTQ0NDAsImRvbWFpbkxvb2t1cEVuZCI6MTU4MTkzMTY5NDUxMywiY29ubmVjdFN0YXJ0IjoxNTgxOTMxNjk0NTEzLCJjb25uZWN0RW5kIjoxNTgxOTMxNjk0NjY1LCJzZWN1cmVDb25uZWN0aW9uU3RhcnQiOjE1ODE5MzE2OTQ1NzIsInJlcXVlc3RTdGFydCI6MTU4MTkzMTY5NDY2NSwicmVzcG9uc2VTdGFydCI6MTU4MTkzMTY5NDc1MCwicmVzcG9uc2VFbmQiOjE1ODE5MzE2OTQ3NTAsImRvbUxvYWRpbmciOjE1ODE5MzE2OTQ3NjIsImRvbUludGVyYWN0aXZlIjoxNTgxOTMxNjk1OTYzLCJkb21Db250ZW50TG9hZGVkRXZlbnRTdGFydCI6MTU4MTkzMTY5NjAzOSwiZG9tQ29udGVudExvYWRlZEV2ZW50RW5kIjoxNTgxOTMxNjk2MDM5LCJkb21Db21wbGV0ZSI6MCwibG9hZEV2ZW50U3RhcnQiOjAsImxvYWRFdmVudEVuZCI6MH19XX0"},{"name":"tna","value":"msc-gcp-stg1"},{"name":"cs","value":"UTF-8"},{"name":"cd","value":"24"},{"name":"page","value":"Snowplow 
CDI"},{"name":"stm","value":"1581931696507"},{"name":"tz","value":"Europe/Berlin"},{"name":"tv","value":"js-2.10.2"},{"name":"vp","value":"1918x982"},{"name":"ds","value":"1918x982"},{"name":"res","value":"1920x1080"},{"name":"cookie","value":"1"},{"name":"p","value":"web"},{"name":"dtm","value":"1581931696114"},{"name":"lang","value":"en-US"},{"name":"sid","value":"96958bf6-a8bf-4be8-9c67-fd957b6bc8d2"}],"contentType":"application/json","loaderName":"ssc-1.0.0-rc4-googlepubsub","encoding":"UTF-8","hostname":"gcp-sandbox-prod1.collector.snplow.net","timestamp":"2020-02-17T09:28:16.560Z","ipAddress":"18.194.133.57","useragent":"Mozilla/5.0 (X11; Linux x86_64; rv:72.0) Gecko/20100101 Firefox/72.0","refererUri":"https://console.snowplowanalytics.com/","headers":["Timeout-Access: ","Host: gcp-sandbox-prod1.collector.snplow.net","User-Agent: Mozilla/5.0 (X11; Linux x86_64; rv:72.0) Gecko/20100101 Firefox/72.0","Accept: */*","Accept-Language: en-US, en;q=0.5","Accept-Encoding: gzip, deflate, br","Origin: https://console.snowplowanalytics.com","Referer: https://console.snowplowanalytics.com/","Cookie: sp=510b2f05-27e3-4fd3-b449-a2702926da5e","X-Cloud-Trace-Context: 958285ba723e212998af29cec405e002/12535615945289151925","Via: 1.1 google","X-Forwarded-For: 18.194.133.57, 35.201.76.62","X-Forwarded-Proto: https","Connection: Keep-Alive","application/json"],"userId":"510b2f05-27e3-4fd3-b449-a2702926da5e"}}}}""" parse(badrow).foreach(println) ``` diff --git a/docs/data-product-studio/data-quality/failed-events/recovering-failed-events/manual/index.md b/docs/data-product-studio/data-quality/failed-events/recovering-failed-events/manual/index.md index 9e38d7773..f0faca1d8 100644 --- a/docs/data-product-studio/data-quality/failed-events/recovering-failed-events/manual/index.md +++ b/docs/data-product-studio/data-quality/failed-events/recovering-failed-events/manual/index.md @@ -1,11 +1,8 @@ --- -title: "Event Recovery for Community Edition" +title: "Event Recovery for Snowplow Self-Hosted" sidebar_label: "Manual recovery" sidebar_position: 40 -sidebar_custom_props: - offerings: - - community -description: "Low level tooling for manual event recovery in Community Edition." +description: "Low level tooling for manual event recovery in Snowplow Self-Hosted." --- Snowplow pipelines are "non-lossy", this means if something is wrong with an event during any part of the pipeline, the event is stored in a separate storage environment rather than just discarded. See the [failed events section](/docs/fundamentals/failed-events/index.md) for more information on the types of failures that may occur. diff --git a/docs/data-product-studio/data-quality/index.md b/docs/data-product-studio/data-quality/index.md index 3d0d46c12..35cd20fed 100644 --- a/docs/data-product-studio/data-quality/index.md +++ b/docs/data-product-studio/data-quality/index.md @@ -10,7 +10,7 @@ There are a number of ways you can test and QA your pipeline to follow good data When implementing new tracking, or when making changes to your schemas or enrichments, we recommend you run testing by sending events to a sandbox environment before deploying your changes to Production environments. -1. Find the sandbox endpoint in Snowplow Console (Snowplow BDP customers only) - this is accessible on the _Environments_ screen, as well as in the _'Testing details'_ dialog box on _Data Structures_ and _Enrichments_ screens. +1. 
Find the sandbox endpoint in Snowplow Console (Snowplow customers only) - this is accessible on the _Environments_ screen, as well as in the _'Testing details'_ dialog box on _Data Structures_ and _Enrichments_ screens. 2. Send a few events from your application to the sandbox endpoint. 3. Visit the OpenSearch Dashboard interface for your sandbox environment to check that your events have landed in the good queue (i.e. are valid) and that the data looks as you expect it to look (i.e. enriched appropriately, formatted and structured correctly). 4. Once you are happy that your changes are valid, you can deploy them to Production along with any application code. diff --git a/docs/data-product-studio/data-quality/snowplow-inspector/adding-schemas/index.md b/docs/data-product-studio/data-quality/snowplow-inspector/adding-schemas/index.md index 38d7f8901..5577ad164 100644 --- a/docs/data-product-studio/data-quality/snowplow-inspector/adding-schemas/index.md +++ b/docs/data-product-studio/data-quality/snowplow-inspector/adding-schemas/index.md @@ -69,12 +69,12 @@ These schemas are managed within the extension and stored on your own machine. The only required configuration is a name, and the schemas themselves. ## Data Structures API registries -This type is recommended for use with Snowplow BDP via the [Data Structures API](/docs/data-product-studio/data-structures/manage/api/index.md). +This type is recommended for use with the [Data Structures API](/docs/data-product-studio/data-structures/manage/api/index.md). In order to function, the extension requires: -- Organization ID: This is usually found in the URL when logged into the Snowplow BDP console. See more at [Managing Console API authentication](/docs/account-management/index.md#version-2). -- API Key: When logged into the Snowplow BDP console, should be available in [API keys for managing Snowplow](https://console.snowplowanalytics.com/credentials) (within "Manage organization"). See more at [Managing Console API authentication](/docs/account-management/index.md#version-2). +- Organization ID: find this under **Settings** > **Manage organization** in [Console](https://console.snowplowanalytics.com). +- API Key: go to **Settings** > **Manage organization** in Console to manage your API keys. ## Iglu Server registries [Iglu Server](/docs/api-reference/iglu/iglu-repositories/iglu-server/index.md) is a more full-featured dedicated service for hosting server that is more flexible than Static Registries. @@ -84,7 +84,7 @@ To authenticate with your server, the extension will require: - Iglu API Endpoint: This is the base URL the extension will use when contacting the API. If you include a path component, the API request will be made as `api/*`, relative to this path; you may need to add or remove trailing slashes if the API is not hosted at the root. - Iglu API Key: See [API keys and the authentication service](/docs/api-reference/iglu/iglu-repositories/iglu-server/index.md#5-api-keys-and-the-authentication-service-apiauth) for instructions on generating an Iglu API key. -When logged into the Snowplow BDP console, these details should be available in [API keys for utilities](https://console.snowplowanalytics.com/iglu-keys) (within "Manage organization"). +When logged into Snowplow Console, these details should be available in [API keys for utilities](https://console.snowplowanalytics.com/iglu-keys) (within "Manage organization"). 
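Before saving an Iglu Server registry in the extension, it can help to confirm that the endpoint and key actually work. A minimal sketch, assuming your Iglu API Endpoint is `https://com-example.iglu.snplow.net/api` and `IGLU_API_KEY` holds a valid key (both values are placeholders):

```bash
# Placeholder values: substitute your own Iglu API Endpoint and key.
IGLU_ENDPOINT="https://com-example.iglu.snplow.net/api"
IGLU_API_KEY="00000000-0000-0000-0000-000000000000"

# Iglu Server authenticates requests via the `apikey` header.
# Listing schemas should return JSON rather than an authentication error.
curl -s -H "apikey: ${IGLU_API_KEY}" "${IGLU_ENDPOINT}/schemas" | head
```

If this request fails, check whether the endpoint needs a trailing path component (or has one too many), as described above, before adjusting the extension configuration.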
## Static registries Most other registries, such as those hosted as websites or via [S3](https://aws.amazon.com/s3/) or [GCS](https://cloud.google.com/products/storage/) buckets will use this ["static"](/docs/api-reference/iglu/iglu-repositories/static-repo/index.md) type. diff --git a/docs/data-product-studio/data-quality/snowplow-micro/adding-schemas/index.md b/docs/data-product-studio/data-quality/snowplow-micro/adding-schemas/index.md index 008a1e554..d654e2afe 100644 --- a/docs/data-product-studio/data-quality/snowplow-micro/adding-schemas/index.md +++ b/docs/data-product-studio/data-quality/snowplow-micro/adding-schemas/index.md @@ -34,7 +34,7 @@ trackSelfDescribingEvent({ For Micro to understand this event, it will need to know about `com.example/my-schema/jsonschema/1-0-0` or any other relevant schemas. There are two ways you can achieve this: -* **Point Micro to an Iglu registry that contains your schemas.** This is a good option if you use Snowplow BDP [UI](/docs/data-product-studio/data-structures/manage/index.md) or [API](/docs/data-product-studio/data-structures/manage/api/index.md) to create schemas, or if you have deployed your own Iglu registry. +* **Point Micro to an Iglu registry that contains your schemas.** This is a good option if you use Snowplow [Console](/docs/data-product-studio/data-structures/manage/index.md) UI or [API](/docs/data-product-studio/data-structures/manage/api/index.md) to create schemas, or if you have deployed your own Iglu registry. * **Add schemas to Micro directly.** This can be handy for quickly testing a schema. Whichever approach you choose, you can use the [the API](/docs/api-reference/snowplow-micro/api/index.md#microiglu) to check if Micro is able to reach your schemas (replace `com.example` and `my-schema` as appropriate). @@ -47,11 +47,11 @@ curl localhost:9090/micro/iglu/com.example/my-schema/jsonschema/1-0-0 Place your Iglu registry URL and API key (if any) into two [environment variables](https://en.wikipedia.org/wiki/Environment_variable): `MICRO_IGLU_REGISTRY_URL` and `MICRO_IGLU_API_KEY`. -Make sure to fully spell out the URL, including the protocol (`http://` or `https://`). For most Iglu registries, including those provided by Snowplow BDP, the URL will end with `/api` — make sure to include that part too, for example: `https://com-example.iglu.snplow.net/api`. [Static registries](/docs/api-reference/iglu/iglu-repositories/static-repo/index.md), such as `http://iglucentral.com`, are an exception — you don’t need to append `/api` to the URL. +Make sure to fully spell out the URL, including the protocol (`http://` or `https://`). For most Iglu registries, including those provided by Snowplow CDI, the URL will end with `/api` — make sure to include that part too, for example: `https://com-example.iglu.snplow.net/api`. [Static registries](/docs/api-reference/iglu/iglu-repositories/static-repo/index.md), such as `http://iglucentral.com`, are an exception — you don’t need to append `/api` to the URL. :::tip -In Snowplow BDP, you can find your Iglu registry URLs and generate API keys [via the console](https://console.snowplowanalytics.com/iglu-keys). +You can find your Iglu registry URLs and generate API keys [via the console](https://console.snowplowanalytics.com/iglu-keys). 
::: diff --git a/docs/data-product-studio/data-structures/index.md b/docs/data-product-studio/data-structures/index.md index b43272c58..06efae4eb 100644 --- a/docs/data-product-studio/data-structures/index.md +++ b/docs/data-product-studio/data-structures/index.md @@ -5,7 +5,7 @@ sidebar_position: 3 --- This section explains how to create, manage, and update [data structures](/docs/fundamentals/schemas/index.md) (schemas). Snowplow provides different options for data structure management: -* Snowplow BDP Console UI +* Snowplow Console UI * Data structures API * [Snowplow CLI](/docs/data-product-studio/snowplow-cli/index.md) * For Community users: [Iglu](/docs/api-reference/iglu/iglu-repositories/iglu-server/index.md) diff --git a/docs/data-product-studio/data-structures/manage/api/index.md b/docs/data-product-studio/data-structures/manage/api/index.md index 91a352e6d..89b644b35 100644 --- a/docs/data-product-studio/data-structures/manage/api/index.md +++ b/docs/data-product-studio/data-structures/manage/api/index.md @@ -2,12 +2,9 @@ title: "Managing data structures via the API" sidebar_label: "Data structures API" sidebar_position: 3 -sidebar_custom_props: - offerings: - - bdp --- -As well as managing [data structures](/docs/fundamentals/schemas/index.md) through the Snowplow BDP Console, Snowplow BDP customers can also manage them programmatically through the data structures API. +As well as managing [data structures](/docs/fundamentals/schemas/index.md) through Snowplow Console, Snowplow customers can also manage them programmatically through the data structures API. This functionality is key to automating any existing process you may have, including workflows in version control systems like GitHub. diff --git a/docs/data-product-studio/data-structures/manage/builder/index.md b/docs/data-product-studio/data-structures/manage/builder/index.md index e8839a3f9..1995ddb08 100644 --- a/docs/data-product-studio/data-structures/manage/builder/index.md +++ b/docs/data-product-studio/data-structures/manage/builder/index.md @@ -3,9 +3,6 @@ title: "Managing data structures with the Data Structures Builder" description: "The Data Structures Builder is ideal for quickly creating an event or entity with our guided setup and automated versioning." sidebar_label: "Console: data structures builder" sidebar_position: 1 -sidebar_custom_props: - offerings: - - bdp --- :::info Supported properties diff --git a/docs/data-product-studio/data-structures/manage/cli/index.md b/docs/data-product-studio/data-structures/manage/cli/index.md index 37a4fa538..a16a97227 100644 --- a/docs/data-product-studio/data-structures/manage/cli/index.md +++ b/docs/data-product-studio/data-structures/manage/cli/index.md @@ -3,9 +3,6 @@ title: "Managing data structures via the CLI" description: "Use the 'snowplow-cli data-structures' command to manage your data structures." sidebar_label: "Snowplow CLI" sidebar_position: 2 -sidebar_custom_props: - offerings: - - bdp --- ```mdx-code-block @@ -55,7 +52,7 @@ The CLI download command only retrieves data structures that have been deployed ./snowplow-cli ds validate ./folder-name ``` -This command will find all files under `./folder-name` (if omitted then `./data-structures`) and attempt to validate them using BDP console. It will assert the following +This command will find all files under `./folder-name` (if omitted then `./data-structures`) and attempt to validate them using Snowplow Console. It will assert the following 1. 
Is each file a valid format (yaml/json) with expected fields 2. Does the schema in the file conform to [snowplow expectations](/docs/fundamentals/schemas/index.md#the-anatomy-of-a-schema) @@ -70,6 +67,6 @@ If any validations fail the command will report the problems to stdout and exit ./snowplow-cli ds publish dev ./folder-name ``` -This command will find all files under `./folder-name` (if omitted then `./data-structures`) and attempt to publish them to BDP console in the environment provided (`dev` or `prod`). +This command will find all files under `./folder-name` (if omitted then `./data-structures`) and attempt to publish them to Snowplow Console in the environment provided (`dev` or `prod`). Publishing to `dev` will also cause data structures to be validated with the `validate` command before upload. Publishing to `prod` will not validate but requires all data structures referenced to be present on `dev`. diff --git a/docs/data-product-studio/data-structures/manage/iglu/index.md b/docs/data-product-studio/data-structures/manage/iglu/index.md index 877a6ec25..64f79e70b 100644 --- a/docs/data-product-studio/data-structures/manage/iglu/index.md +++ b/docs/data-product-studio/data-structures/manage/iglu/index.md @@ -3,14 +3,11 @@ title: "Managing schemas using Iglu" date: "2020-02-15" sidebar_label: "Iglu" sidebar_position: 4 -sidebar_custom_props: - offerings: - - community --- ## Setup -To manage your [schemas](/docs/fundamentals/schemas/index.md), you will need to have an [Iglu Server](/docs/api-reference/iglu/iglu-repositories/iglu-server/index.md) installed (you will already have one if you followed the [Community Edition Quick Start](/docs/get-started/snowplow-community-edition/what-is-quick-start/index.md)). +To manage your [schemas](/docs/fundamentals/schemas/index.md), you will need to have an [Iglu Server](/docs/api-reference/iglu/iglu-repositories/iglu-server/index.md) installed (you will already have one if you followed the [Snowplow Self-Hosted Quick Start](/docs/get-started/self-hosted/index.md)). Alternatively, you can host a [static Iglu registry](/docs/api-reference/iglu/iglu-repositories/static-repo/index.md) in Amazon S3 or Google Cloud Storage. diff --git a/docs/data-product-studio/data-structures/manage/index.md b/docs/data-product-studio/data-structures/manage/index.md index 87dd0e964..b8a881b61 100644 --- a/docs/data-product-studio/data-structures/manage/index.md +++ b/docs/data-product-studio/data-structures/manage/index.md @@ -2,14 +2,11 @@ title: "Managing data structures" sidebar_position: 1 sidebar_label: "Manage" -sidebar_custom_props: - offerings: - - bdp --- import ThemedImage from '@theme/ThemedImage'; -To create a new [data structure](/docs/fundamentals/schemas/index.md) using Snowplow BDP Console, first navigate to **Data structures** in the menu and click the **Create a data structure** button. +To create a new [data structure](/docs/fundamentals/schemas/index.md) using Snowplow Console, first navigate to **Data structures** in the menu and click the **Create a data structure** button. 
![](images/image-1.png) diff --git a/docs/data-product-studio/data-structures/manage/json-editor/index.md b/docs/data-product-studio/data-structures/manage/json-editor/index.md index dbfe62718..63312ca4e 100644 --- a/docs/data-product-studio/data-structures/manage/json-editor/index.md +++ b/docs/data-product-studio/data-structures/manage/json-editor/index.md @@ -3,9 +3,6 @@ title: "Managing data structures with the JSON Editor" description: "The JSON editor is best suited for defining complex data structures that require heavy nesting and advanced data types." sidebar_label: "Console: JSON editor" sidebar_position: 1 -sidebar_custom_props: - offerings: - - bdp --- :::info diff --git a/docs/data-product-studio/data-structures/version-amend/amending/index.md b/docs/data-product-studio/data-structures/version-amend/amending/index.md index bac21c974..7ed7801f2 100644 --- a/docs/data-product-studio/data-structures/version-amend/amending/index.md +++ b/docs/data-product-studio/data-structures/version-amend/amending/index.md @@ -44,8 +44,8 @@ flowchart LR ``` We call this approach “patching”. To patch the schema, i.e. apply changes to it without updating the version: -* If you are using Snowplow BDP, select the “Patch” option [in the UI](/docs/data-product-studio/data-structures/manage/index.md) when saving the schema -* If you are using Snowplow Community Edition, do not increment the schema version when [uploading it with `igluctl`](/docs/data-product-studio/data-structures/manage/iglu/index.md) +* If you are using Snowplow CDI, select the “Patch” option [in the UI](/docs/data-product-studio/data-structures/manage/index.md) when saving the schema +* If you are using Snowplow Self-Hosted, do not increment the schema version when [uploading it with `igluctl`](/docs/data-product-studio/data-structures/manage/iglu/index.md) :::danger @@ -55,7 +55,7 @@ Also, never patch a schema version that exists in a production environment, even ::: -For Snowplow BDP customers, patching is disabled for production pipelines. Community Edition users have to explicitly enable patching (if desired) in the [Iglu Server configuration](/docs/api-reference/iglu/iglu-repositories/iglu-server/reference/index.md) (`patchesAllowed`) at their own risk. +For Snowplow CDI customers, patching is disabled for production pipelines. Snowplow Self-Hosted users have to explicitly enable patching (if desired) in the [Iglu Server configuration](/docs/api-reference/iglu/iglu-repositories/iglu-server/reference/index.md) (`patchesAllowed`) at their own risk. 
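For Self-Hosted setups, the patching workflow described above usually comes down to re-uploading the amended schema file with `igluctl` while keeping the same version in its path. A minimal sketch, assuming an Iglu Server at `iglu.example.com`, a local `schemas/` directory, and `patchesAllowed` already enabled on the server (all placeholders):

```bash
# Placeholder values: substitute your own registry URL and API key.
IGLU_SERVER="https://iglu.example.com"
IGLU_API_KEY="00000000-0000-0000-0000-000000000000"

# Edit the existing file in place, for example
# schemas/com.example/my_schema/jsonschema/1-0-0, without bumping 1-0-0.
# Pushing the same version again overwrites (patches) it on the server,
# provided patchesAllowed is enabled in the Iglu Server configuration.
igluctl static push schemas/ "${IGLU_SERVER}" "${IGLU_API_KEY}"
```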
:::tip Schema caching diff --git a/docs/data-product-studio/data-structures/version-amend/builder/index.md b/docs/data-product-studio/data-structures/version-amend/builder/index.md index f6228d0ff..5a5d6bd49 100644 --- a/docs/data-product-studio/data-structures/version-amend/builder/index.md +++ b/docs/data-product-studio/data-structures/version-amend/builder/index.md @@ -3,9 +3,6 @@ title: "Versioning Data Structures with the Data Structures Builder" date: "2023-03-01" sidebar_label: "Using the Data Structures Builder" sidebar_position: 20 -sidebar_custom_props: - offerings: - - bdp --- # Versioning with the Data Structures Builder diff --git a/docs/data-product-studio/data-structures/version-amend/enterprise/index.md b/docs/data-product-studio/data-structures/version-amend/enterprise/index.md index 07c61e28d..0b78f533e 100644 --- a/docs/data-product-studio/data-structures/version-amend/enterprise/index.md +++ b/docs/data-product-studio/data-structures/version-amend/enterprise/index.md @@ -2,9 +2,6 @@ title: "Versioning Data Structures with the JSON Editor" sidebar_label: "Using the JSON Editor" sidebar_position: 10 -sidebar_custom_props: - offerings: - - bdp --- ## How do I version? diff --git a/docs/data-product-studio/data-structures/version-amend/iglu/index.md b/docs/data-product-studio/data-structures/version-amend/iglu/index.md index 5bd31f05a..ef03f9116 100644 --- a/docs/data-product-studio/data-structures/version-amend/iglu/index.md +++ b/docs/data-product-studio/data-structures/version-amend/iglu/index.md @@ -2,9 +2,6 @@ title: "Versioning Data Structures using Iglu" sidebar_label: "Using Iglu" sidebar_position: 30 -sidebar_custom_props: - offerings: - - community --- ## How do I version? diff --git a/docs/data-product-studio/event-specifications/api/index.md b/docs/data-product-studio/event-specifications/api/index.md index 66d7f503f..9795ea75d 100644 --- a/docs/data-product-studio/event-specifications/api/index.md +++ b/docs/data-product-studio/event-specifications/api/index.md @@ -2,9 +2,6 @@ title: "Managing Event Specifications via the API" sidebar_label: "Using the API" sidebar_position: 1 -sidebar_custom_props: - offerings: - - bdp --- With the [**Event Specifications API**](https://console.snowplowanalytics.com/api/msc/v1/docs), you can efficiently manage event specifications programmatically. Whether you want to retrieve, create, edit, publish, deprecate, or delete event specifications, the API provides the necessary endpoints and functionalities. 
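As a rough sketch of calling the API from the command line: requests are authenticated with a JWT obtained from your Console API key, and the exact endpoint paths are listed in the API documentation linked above. The path segment below is a placeholder, not a documented endpoint:

```bash
ORG_ID="your-organization-id"   # placeholder: your organization ID
JWT="eyJ..."                    # placeholder: a JWT obtained via the Credentials API

# Replace <endpoint-from-docs> with a path from the linked API documentation;
# it is shown here only to illustrate the Authorization header pattern.
curl -s \
  -H "Authorization: Bearer ${JWT}" \
  "https://console.snowplowanalytics.com/api/msc/v1/organizations/${ORG_ID}/<endpoint-from-docs>"
```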
diff --git a/docs/data-product-studio/event-specifications/index.md b/docs/data-product-studio/event-specifications/index.md index 96d40200c..945af88f2 100644 --- a/docs/data-product-studio/event-specifications/index.md +++ b/docs/data-product-studio/event-specifications/index.md @@ -2,9 +2,6 @@ title: "Managing Event Specifications in the Console" sidebar_label: "Event specifications" sidebar_position: 4 -sidebar_custom_props: - offerings: - - bdp --- ## Creating a new Event Specification through Console diff --git a/docs/data-product-studio/event-specifications/tracking-plans/index.md b/docs/data-product-studio/event-specifications/tracking-plans/index.md index 8953c42ff..8b0e20e31 100644 --- a/docs/data-product-studio/event-specifications/tracking-plans/index.md +++ b/docs/data-product-studio/event-specifications/tracking-plans/index.md @@ -2,9 +2,6 @@ title: "Creating a tracking plan with event specifications" sidebar_label: "Creating a tracking plan" sidebar_position: 2 -sidebar_custom_props: - offerings: - - bdp --- As explained in [Introduction to tracking design](/docs/data-product-studio/index.md), to use Snowplow successfully, you need to have a good idea of: @@ -53,5 +50,5 @@ The event validation part allows you to set the instructions for tracking implem The [Entities](/docs/fundamentals/entities/index.md) part allows you to declare which entities should be tracked with the event. You can also define whether the entity should be mandatory or optional, or whether more than one instance should be tracked with this event. :::info -Snowplow BDP provides both a UI and an API to manage your Event Specifications. For information about managing event specifications see [How to manage Event Specifications](/docs/data-product-studio/event-specifications/index.md). +Snowplow provides both a UI and an API to manage your Event Specifications. For information about managing event specifications see [How to manage Event Specifications](/docs/data-product-studio/event-specifications/index.md). ::: diff --git a/docs/data-product-studio/index.md b/docs/data-product-studio/index.md index 0e267bc70..6b05647e3 100644 --- a/docs/data-product-studio/index.md +++ b/docs/data-product-studio/index.md @@ -9,7 +9,7 @@ sidebar_custom_props: Data Product Studio is a set of tooling for designing and implementing behavioral data event tracking, including schemas (data structures), ownership, observability, and code generation. The tools help you improve data quality, and allow you to add a data contracts/governance guarantee. -The Data Product Studio UI is included in the Snowplow BDP Console. +The Data Product Studio UI is included in Snowplow Console. ## Tracking design basics @@ -40,7 +40,7 @@ This is where creating a **Tracking Plan** comes into play. It is a comprehensiv - Other relevant information. :::info -Snowplow BDP customers can create tracking plans directly in Snowplow instead of using an external document. See [Creating tracking plans](/docs/data-product-studio/event-specifications/tracking-plans/index.md) for more information. +Snowplow customers can create tracking plans directly in Snowplow instead of using an external document. See [Creating tracking plans](/docs/data-product-studio/event-specifications/tracking-plans/index.md) for more information. ::: Snowplow also uses a **schema registry** to store the definition of these data structures. @@ -153,7 +153,7 @@ These questions may help when defining your events: * When should the events happen? What are the triggers of the events? 
:::note Event Specifications -The last two questions above can be captured using [event specifications in BDP Enterprise and Cloud](https://snowplow.io/blog/tracking-scenarios-release/). +The last two questions above can be captured using [event specifications](https://snowplow.io/blog/tracking-scenarios-release/). ::: A common challenge in defining event schemas is the choice of their granularity. diff --git a/docs/data-product-studio/snowplow-cli/index.md b/docs/data-product-studio/snowplow-cli/index.md index 407f74ea3..7e1f52eb4 100644 --- a/docs/data-product-studio/snowplow-cli/index.md +++ b/docs/data-product-studio/snowplow-cli/index.md @@ -59,7 +59,7 @@ You can also use optional flags: If you prefer manual configuration, you will need these values: -* An API Key ID and the corresponding API Key (secret), which are generated from the [credentials section](https://console.snowplowanalytics.com/credentials) in BDP Console. +* An API Key ID and the corresponding API Key (secret), which are generated from the [credentials section](https://console.snowplowanalytics.com/credentials) in Console. * Your Organization ID, which you can find [on the _Manage organization_ page](https://console.snowplowanalytics.com/settings) in Console. Snowplow CLI can take its configuration from a variety of sources. More details are available from `./snowplow-cli data-structures --help`. Variations on these three examples should serve most cases. diff --git a/docs/data-product-studio/snowplow-cli/reference/index.md b/docs/data-product-studio/snowplow-cli/reference/index.md index b95502dfc..dcbd5a03c 100644 --- a/docs/data-product-studio/snowplow-cli/reference/index.md +++ b/docs/data-product-studio/snowplow-cli/reference/index.md @@ -20,10 +20,10 @@ Work with Snowplow data products ### Options ``` - -S, --api-key string BDP console api key - -a, --api-key-id string BDP console api key id + -S, --api-key string Snowplow Console api key + -a, --api-key-id string Snowplow Console api key id -h, --help help for data-products - -H, --host string BDP console host (default "https://console.snowplowanalytics.com") + -H, --host string Snowplow Console host (default "https://console.snowplowanalytics.com") -m, --managed-from string Link to a github repo where the data structure is managed -o, --org-id string Your organization id ``` @@ -82,8 +82,8 @@ snowplow-cli data-products add-event-spec {path} [flags] ### Options inherited from parent commands ``` - -S, --api-key string BDP console api key - -a, --api-key-id string BDP console api key id + -S, --api-key string Snowplow Console api key + -a, --api-key-id string Snowplow Console api key id --config string Config file. 
Defaults to $HOME/.config/snowplow/snowplow.yml Then on: Unix $XDG_CONFIG_HOME/snowplow/snowplow.yml @@ -95,7 +95,7 @@ snowplow-cli data-products add-event-spec {path} [flags] Unix $HOME/.config/snowplow/.env Darwin $HOME/Library/Application Support/snowplow/.env Windows %AppData%\snowplow\.env - -H, --host string BDP console host (default "https://console.snowplowanalytics.com") + -H, --host string Snowplow Console host (default "https://console.snowplowanalytics.com") --json-output Log output as json -m, --managed-from string Link to a github repo where the data structure is managed -o, --org-id string Your organization id @@ -108,11 +108,11 @@ snowplow-cli data-products add-event-spec {path} [flags] ## Data-Products Download -Download all data products, event specs and source apps from BDP Console +Download all data products, event specs and source apps from Snowplow Console ### Synopsis -Downloads the latest versions of all data products, event specs and source apps from BDP Console. +Downloads the latest versions of all data products, event specs and source apps from Snowplow Console. If no directory is provided then defaults to 'data-products' in the current directory. Source apps are stored in the nested 'source-apps' directory @@ -138,8 +138,8 @@ snowplow-cli data-products download {directory ./data-products} [flags] ### Options inherited from parent commands ``` - -S, --api-key string BDP console api key - -a, --api-key-id string BDP console api key id + -S, --api-key string Snowplow Console api key + -a, --api-key-id string Snowplow Console api key id --config string Config file. Defaults to $HOME/.config/snowplow/snowplow.yml Then on: Unix $XDG_CONFIG_HOME/snowplow/snowplow.yml @@ -151,7 +151,7 @@ snowplow-cli data-products download {directory ./data-products} [flags] Unix $HOME/.config/snowplow/.env Darwin $HOME/Library/Application Support/snowplow/.env Windows %AppData%\snowplow\.env - -H, --host string BDP console host (default "https://console.snowplowanalytics.com") + -H, --host string Snowplow Console host (default "https://console.snowplowanalytics.com") --json-output Log output as json -m, --managed-from string Link to a github repo where the data structure is managed -o, --org-id string Your organization id @@ -203,8 +203,8 @@ snowplow-cli data-products generate [paths...] [flags] ### Options inherited from parent commands ``` - -S, --api-key string BDP console api key - -a, --api-key-id string BDP console api key id + -S, --api-key string Snowplow Console api key + -a, --api-key-id string Snowplow Console api key id --config string Config file. Defaults to $HOME/.config/snowplow/snowplow.yml Then on: Unix $XDG_CONFIG_HOME/snowplow/snowplow.yml @@ -216,7 +216,7 @@ snowplow-cli data-products generate [paths...] [flags] Unix $HOME/.config/snowplow/.env Darwin $HOME/Library/Application Support/snowplow/.env Windows %AppData%\snowplow\.env - -H, --host string BDP console host (default "https://console.snowplowanalytics.com") + -H, --host string Snowplow Console host (default "https://console.snowplowanalytics.com") --json-output Log output as json -m, --managed-from string Link to a github repo where the data structure is managed -o, --org-id string Your organization id @@ -229,11 +229,11 @@ snowplow-cli data-products generate [paths...] 
[flags] ## Data-Products Publish -Publish all data products, event specs and source apps to BDP Console +Publish all data products, event specs and source apps to Snowplow Console ### Synopsis -Publish the local version versions of all data products, event specs and source apps from BDP Console. +Publish the local version versions of all data products, event specs and source apps from Snowplow Console. If no directory is provided then defaults to 'data-products' in the current directory. Source apps are stored in the nested 'source-apps' directory @@ -260,8 +260,8 @@ snowplow-cli data-products publish {directory ./data-products} [flags] ### Options inherited from parent commands ``` - -S, --api-key string BDP console api key - -a, --api-key-id string BDP console api key id + -S, --api-key string Snowplow Console api key + -a, --api-key-id string Snowplow Console api key id --config string Config file. Defaults to $HOME/.config/snowplow/snowplow.yml Then on: Unix $XDG_CONFIG_HOME/snowplow/snowplow.yml @@ -273,7 +273,7 @@ snowplow-cli data-products publish {directory ./data-products} [flags] Unix $HOME/.config/snowplow/.env Darwin $HOME/Library/Application Support/snowplow/.env Windows %AppData%\snowplow\.env - -H, --host string BDP console host (default "https://console.snowplowanalytics.com") + -H, --host string Snowplow Console host (default "https://console.snowplowanalytics.com") --json-output Log output as json -m, --managed-from string Link to a github repo where the data structure is managed -o, --org-id string Your organization id @@ -315,8 +315,8 @@ snowplow-cli data-products purge {directory ./data-products} [flags] ### Options inherited from parent commands ``` - -S, --api-key string BDP console api key - -a, --api-key-id string BDP console api key id + -S, --api-key string Snowplow Console api key + -a, --api-key-id string Snowplow Console api key id --config string Config file. Defaults to $HOME/.config/snowplow/snowplow.yml Then on: Unix $XDG_CONFIG_HOME/snowplow/snowplow.yml @@ -328,7 +328,7 @@ snowplow-cli data-products purge {directory ./data-products} [flags] Unix $HOME/.config/snowplow/.env Darwin $HOME/Library/Application Support/snowplow/.env Windows %AppData%\snowplow\.env - -H, --host string BDP console host (default "https://console.snowplowanalytics.com") + -H, --host string Snowplow Console host (default "https://console.snowplowanalytics.com") --json-output Log output as json -m, --managed-from string Link to a github repo where the data structure is managed -o, --org-id string Your organization id @@ -341,11 +341,11 @@ snowplow-cli data-products purge {directory ./data-products} [flags] ## Data-Products Validate -Validate data products and source applications with BDP Console +Validate data products and source applications with Snowplow Console ### Synopsis -Sends all data products and source applications from \ for validation by BDP Console. +Sends all data products and source applications from \ for validation by Snowplow Console. ``` snowplow-cli data-products validate [paths...] [flags] @@ -370,8 +370,8 @@ snowplow-cli data-products validate [paths...] [flags] ### Options inherited from parent commands ``` - -S, --api-key string BDP console api key - -a, --api-key-id string BDP console api key id + -S, --api-key string Snowplow Console api key + -a, --api-key-id string Snowplow Console api key id --config string Config file. 
Defaults to $HOME/.config/snowplow/snowplow.yml Then on: Unix $XDG_CONFIG_HOME/snowplow/snowplow.yml @@ -383,7 +383,7 @@ snowplow-cli data-products validate [paths...] [flags] Unix $HOME/.config/snowplow/.env Darwin $HOME/Library/Application Support/snowplow/.env Windows %AppData%\snowplow\.env - -H, --host string BDP console host (default "https://console.snowplowanalytics.com") + -H, --host string Snowplow Console host (default "https://console.snowplowanalytics.com") --json-output Log output as json -m, --managed-from string Link to a github repo where the data structure is managed -o, --org-id string Your organization id @@ -409,10 +409,10 @@ Work with Snowplow data structures ### Options ``` - -S, --api-key string BDP console api key - -a, --api-key-id string BDP console api key id + -S, --api-key string Snowplow Console api key + -a, --api-key-id string Snowplow Console api key id -h, --help help for data-structures - -H, --host string BDP console host (default "https://console.snowplowanalytics.com") + -H, --host string Snowplow Console host (default "https://console.snowplowanalytics.com") -m, --managed-from string Link to a github repo where the data structure is managed -o, --org-id string Your organization id ``` @@ -441,11 +441,11 @@ Work with Snowplow data structures ## Data-Structures Download -Download all data structures from BDP Console +Download all data structures from Snowplow Console ### Synopsis -Downloads the latest versions of all data structures from BDP Console. +Downloads the latest versions of all data structures from Snowplow Console. Will retrieve schema contents from your development environment. If no directory is provided then defaults to 'data-structures' in the current directory. @@ -486,8 +486,8 @@ snowplow-cli data-structures download {directory ./data-structures} [flags] ### Options inherited from parent commands ``` - -S, --api-key string BDP console api key - -a, --api-key-id string BDP console api key id + -S, --api-key string Snowplow Console api key + -a, --api-key-id string Snowplow Console api key id --config string Config file. Defaults to $HOME/.config/snowplow/snowplow.yml Then on: Unix $XDG_CONFIG_HOME/snowplow/snowplow.yml @@ -499,7 +499,7 @@ snowplow-cli data-structures download {directory ./data-structures} [flags] Unix $HOME/.config/snowplow/.env Darwin $HOME/Library/Application Support/snowplow/.env Windows %AppData%\snowplow\.env - -H, --host string BDP console host (default "https://console.snowplowanalytics.com") + -H, --host string Snowplow Console host (default "https://console.snowplowanalytics.com") --json-output Log output as json -m, --managed-from string Link to a github repo where the data structure is managed -o, --org-id string Your organization id @@ -553,8 +553,8 @@ snowplow-cli data-structures generate login_click {directory ./data-structures} ### Options inherited from parent commands ``` - -S, --api-key string BDP console api key - -a, --api-key-id string BDP console api key id + -S, --api-key string Snowplow Console api key + -a, --api-key-id string Snowplow Console api key id --config string Config file. 
Defaults to $HOME/.config/snowplow/snowplow.yml Then on: Unix $XDG_CONFIG_HOME/snowplow/snowplow.yml @@ -566,7 +566,7 @@ snowplow-cli data-structures generate login_click {directory ./data-structures} Unix $HOME/.config/snowplow/.env Darwin $HOME/Library/Application Support/snowplow/.env Windows %AppData%\snowplow\.env - -H, --host string BDP console host (default "https://console.snowplowanalytics.com") + -H, --host string Snowplow Console host (default "https://console.snowplowanalytics.com") --json-output Log output as json -m, --managed-from string Link to a github repo where the data structure is managed -o, --org-id string Your organization id @@ -585,7 +585,7 @@ Publishing commands for data structures Publishing commands for data structures -Publish local data structures to BDP console. +Publish local data structures to Snowplow Console. ### Options @@ -597,8 +597,8 @@ Publish local data structures to BDP console. ### Options inherited from parent commands ``` - -S, --api-key string BDP console api key - -a, --api-key-id string BDP console api key id + -S, --api-key string Snowplow Console api key + -a, --api-key-id string Snowplow Console api key id --config string Config file. Defaults to $HOME/.config/snowplow/snowplow.yml Then on: Unix $XDG_CONFIG_HOME/snowplow/snowplow.yml @@ -610,7 +610,7 @@ Publish local data structures to BDP console. Unix $HOME/.config/snowplow/.env Darwin $HOME/Library/Application Support/snowplow/.env Windows %AppData%\snowplow\.env - -H, --host string BDP console host (default "https://console.snowplowanalytics.com") + -H, --host string Snowplow Console host (default "https://console.snowplowanalytics.com") --json-output Log output as json -m, --managed-from string Link to a github repo where the data structure is managed -o, --org-id string Your organization id @@ -627,11 +627,11 @@ Publish data structures to your development environment ### Synopsis -Publish modified data structures to BDP Console and your development environment +Publish modified data structures to Snowplow Console and your development environment -The 'meta' section of a data structure is not versioned within BDP Console. +The 'meta' section of a data structure is not versioned within Snowplow Console. Changes to it will be published by this command. - + ``` snowplow-cli data-structures publish dev [paths...] default: [./data-structures] [flags] @@ -656,8 +656,8 @@ snowplow-cli data-structures publish dev [paths...] default: [./data-structures] ### Options inherited from parent commands ``` - -S, --api-key string BDP console api key - -a, --api-key-id string BDP console api key id + -S, --api-key string Snowplow Console api key + -a, --api-key-id string Snowplow Console api key id --config string Config file. Defaults to $HOME/.config/snowplow/snowplow.yml Then on: Unix $XDG_CONFIG_HOME/snowplow/snowplow.yml @@ -669,7 +669,7 @@ snowplow-cli data-structures publish dev [paths...] 
default: [./data-structures] Unix $HOME/.config/snowplow/.env Darwin $HOME/Library/Application Support/snowplow/.env Windows %AppData%\snowplow\.env - -H, --host string BDP console host (default "https://console.snowplowanalytics.com") + -H, --host string Snowplow Console host (default "https://console.snowplowanalytics.com") --json-output Log output as json -m, --managed-from string Link to a github repo where the data structure is managed -o, --org-id string Your organization id @@ -690,7 +690,7 @@ Publish data structures from your development to your production environment Data structures found on \ which are deployed to your development environment will be published to your production environment. - + ``` snowplow-cli data-structures publish prod [paths...] default: [./data-structures] [flags] @@ -703,7 +703,7 @@ snowplow-cli data-structures publish prod [paths...] default: [./data-structures $ snowplow-cli ds publish prod $ snowplow-cli ds publish prod --dry-run $ snowplow-cli ds publish prod --dry-run ./my-data-structures ./my-other-data-structures - + ``` ### Options @@ -716,8 +716,8 @@ snowplow-cli data-structures publish prod [paths...] default: [./data-structures ### Options inherited from parent commands ``` - -S, --api-key string BDP console api key - -a, --api-key-id string BDP console api key id + -S, --api-key string Snowplow Console api key + -a, --api-key-id string Snowplow Console api key id --config string Config file. Defaults to $HOME/.config/snowplow/snowplow.yml Then on: Unix $XDG_CONFIG_HOME/snowplow/snowplow.yml @@ -729,7 +729,7 @@ snowplow-cli data-structures publish prod [paths...] default: [./data-structures Unix $HOME/.config/snowplow/.env Darwin $HOME/Library/Application Support/snowplow/.env Windows %AppData%\snowplow\.env - -H, --host string BDP console host (default "https://console.snowplowanalytics.com") + -H, --host string Snowplow Console host (default "https://console.snowplowanalytics.com") --json-output Log output as json -m, --managed-from string Link to a github repo where the data structure is managed -o, --org-id string Your organization id @@ -742,11 +742,11 @@ snowplow-cli data-structures publish prod [paths...] default: [./data-structures ## Data-Structures Validate -Validate data structures with BDP Console +Validate data structures with Snowplow Console ### Synopsis -Sends all data structures from \ for validation by BDP Console. +Sends all data structures from \ for validation by Snowplow Console. ``` snowplow-cli data-structures validate [paths...] default: [./data-structures] [flags] @@ -769,8 +769,8 @@ snowplow-cli data-structures validate [paths...] default: [./data-structures] [f ### Options inherited from parent commands ``` - -S, --api-key string BDP console api key - -a, --api-key-id string BDP console api key id + -S, --api-key string Snowplow Console api key + -a, --api-key-id string Snowplow Console api key id --config string Config file. Defaults to $HOME/.config/snowplow/snowplow.yml Then on: Unix $XDG_CONFIG_HOME/snowplow/snowplow.yml @@ -782,7 +782,7 @@ snowplow-cli data-structures validate [paths...] 
default: [./data-structures] [f Unix $HOME/.config/snowplow/.env Darwin $HOME/Library/Application Support/snowplow/.env Windows %AppData%\snowplow\.env - -H, --host string BDP console host (default "https://console.snowplowanalytics.com") + -H, --host string Snowplow Console host (default "https://console.snowplowanalytics.com") --json-output Log output as json -m, --managed-from string Link to a github repo where the data structure is managed -o, --org-id string Your organization id @@ -856,12 +856,12 @@ Setup options: ### Options ``` - -S, --api-key string BDP console api key - -a, --api-key-id string BDP console api key id + -S, --api-key string Snowplow Console api key + -a, --api-key-id string Snowplow Console api key id --base-directory string The base path to use for relative file lookups. Useful for clients that pass in relative file paths. --dump-context Dumps the result of the get_context tool to stdout and exits. -h, --help help for mcp - -H, --host string BDP console host (default "https://console.snowplowanalytics.com") + -H, --host string Snowplow Console host (default "https://console.snowplowanalytics.com") -m, --managed-from string Link to a github repo where the data structure is managed -o, --org-id string Your organization id ``` @@ -910,13 +910,13 @@ snowplow-cli setup [flags] ### Options ``` - -S, --api-key string BDP console api key - -a, --api-key-id string BDP console api key id + -S, --api-key string Snowplow Console api key + -a, --api-key-id string Snowplow Console api key id --auth0-domain string Auth0 domain (default "id.snowplowanalytics.com") --client-id string Auth0 Client ID for device auth (default "EXQ3csSDr6D7wTIiebNPhXpgkSsOzCzi") --dotenv Store as .env file in current working directory -h, --help help for setup - -H, --host string BDP console host (default "https://console.snowplowanalytics.com") + -H, --host string Snowplow Console host (default "https://console.snowplowanalytics.com") -m, --managed-from string Link to a github repo where the data structure is managed -o, --org-id string Your organization id --read-only Create a read-only API key @@ -965,10 +965,10 @@ snowplow-cli status [flags] ### Options ``` - -S, --api-key string BDP console api key - -a, --api-key-id string BDP console api key id + -S, --api-key string Snowplow Console api key + -a, --api-key-id string Snowplow Console api key id -h, --help help for status - -H, --host string BDP console host (default "https://console.snowplowanalytics.com") + -H, --host string Snowplow Console host (default "https://console.snowplowanalytics.com") -m, --managed-from string Link to a github repo where the data structure is managed -o, --org-id string Your organization id ``` @@ -991,6 +991,3 @@ snowplow-cli status [flags] -q, --quiet Log output level to Warn -s, --silent Disable output ``` - - - diff --git a/docs/data-product-studio/snowtype/index.md b/docs/data-product-studio/snowtype/index.md index a141965d2..e5e42daf9 100644 --- a/docs/data-product-studio/snowtype/index.md +++ b/docs/data-product-studio/snowtype/index.md @@ -2,9 +2,6 @@ title: "Snowtype (Code generation) - automatically generate code for Snowplow tracking SDKs" sidebar_position: 6 sidebar_label: "Snowtype (Code generation)" -sidebar_custom_props: - offerings: - - bdp --- ```mdx-code-block diff --git a/docs/data-product-studio/snowtype/snowtype-config/index.md b/docs/data-product-studio/snowtype/snowtype-config/index.md index de05e8cd5..e6f84c5f5 100644 --- a/docs/data-product-studio/snowtype/snowtype-config/index.md +++ 
b/docs/data-product-studio/snowtype/snowtype-config/index.md @@ -35,7 +35,7 @@ The Data Product IDs you wish to generate tracking code for. By providing the Da ### `organizationId` -The Organization ID for your Snowplow BDP account. The Organization ID is a UUID that can be retrieved from the URL immediately following the .com when visiting console. +The Organization ID for your Snowplow account. The Organization ID is a UUID that can be retrieved from the URL immediately following the .com when visiting console. ### `tracker` diff --git a/docs/data-product-studio/snowtype/using-the-cli/index.md b/docs/data-product-studio/snowtype/using-the-cli/index.md index 27233a67a..41468538e 100644 --- a/docs/data-product-studio/snowtype/using-the-cli/index.md +++ b/docs/data-product-studio/snowtype/using-the-cli/index.md @@ -62,7 +62,7 @@ npx @snowplow/snowtype@latest init ``` The input required for the initialization to work, it the following: -- The organization ID from the BDP console. +- The organization ID from Snowplow Console. - The [tracker](./index.md#available-trackerslanguages) you wish to generate code for. - _If applicable,_ the language for that tracker. - The output path you wish the CLI to generate the code to. diff --git a/docs/data-product-studio/source-applications/index.md b/docs/data-product-studio/source-applications/index.md index 4e3c9ecf1..41697360e 100644 --- a/docs/data-product-studio/source-applications/index.md +++ b/docs/data-product-studio/source-applications/index.md @@ -19,7 +19,7 @@ This will let you best manage changes you make to the available Application Cont To illustrate, let's consider Snowplow. We can identify several applications designed for distinct purposes, each serving as a separate data source for behavioral data, or in other words, a Source Application: - The Snowplow website that corresponds to the application served under www.snowplow.io -- The BDP Console application that is served under console.snowplowanalytics.com. +- The Console application that is served under console.snowplowanalytics.com. - The documentation website serving as our information hub, for all things related to our product, served under docs.snowplow.io. Source Applications are a foundational component that enables you to establish the overarching relationships that connect application IDs and [Application Entites](/docs/sources/trackers/web-trackers/custom-tracking-using-schemas/global-context/index.md) and [Data Products](/docs/data-product-studio/data-products/index.md). diff --git a/docs/destinations/forwarding-events/event-forwarding-monitoring-and-troubleshooting/index.md b/docs/destinations/forwarding-events/event-forwarding-monitoring-and-troubleshooting/index.md index d0b34b897..9f41067a5 100644 --- a/docs/destinations/forwarding-events/event-forwarding-monitoring-and-troubleshooting/index.md +++ b/docs/destinations/forwarding-events/event-forwarding-monitoring-and-troubleshooting/index.md @@ -1,6 +1,6 @@ --- title: "Monitoring and troubleshooting forwarders" -description: "Monitor event forwarder performance, debug failures, and understand retry logic with cloud metrics, failed event logs, and BDP Console statistics." +description: "Monitor event forwarder performance, debug failures, and understand retry logic with cloud metrics, failed event logs, and Console statistics." 
sidebar_position: 15 --- @@ -52,7 +52,7 @@ Once a forwarder is deployed, you can configure one or more email addresses to s You can monitor forwarders in a few ways: -- **Console metrics**: you can view high-level delivery statistics in BDP Console. +- **Console metrics**: you can view high-level delivery statistics in Console. - **Cloud monitoring metrics**: forwarders emit a set of metrics to your cloud provider's observability service. - **Failed event logs**: for failed deliveries, Snowplow saves detailed logs to your cloud storage bucket. Consume these logs for automated monitoring in your observability platform of choice. @@ -67,7 +67,7 @@ To view these metrics, navigate to **Destinations** > **Destinations list** and ### Cloud monitoring metrics :::info -Forwarder cloud metrics are only available for [BDP Enterprise](/docs/get-started/snowplow-bdp/index.md#enterprise-in-your-own-cloud) customers. +Forwarder cloud metrics are only available for [CDI Private Managed Cloud](/docs/get-started/index.md#cdi-private-managed-cloud) customers. ::: Forwarders emit the following metrics in your cloud provider's monitoring service: diff --git a/docs/destinations/forwarding-events/google-tag-manager-server-side/index.md b/docs/destinations/forwarding-events/google-tag-manager-server-side/index.md index 9e9ef234b..74e07fb16 100644 --- a/docs/destinations/forwarding-events/google-tag-manager-server-side/index.md +++ b/docs/destinations/forwarding-events/google-tag-manager-server-side/index.md @@ -22,14 +22,14 @@ GTM SS with Snowplow can be setup in two different configurations. ### Destinations Hub (Post-pipeline) -Use GTM SS to relay enriched events to destinations. Events are sent to GTM SS via Snowbridge after being processed by your Snowplow Pipeline. +Use GTM SS to relay enriched events to destinations. Events are sent to GTM SS via Snowbridge after being processed by your Snowplow pipeline. -* For Community Edition, see [Snowbridge](/docs/api-reference/snowbridge/index.md). -* For Snowplow BDP, you can [request setup](https://console.snowplowanalytics.com/destinations/catalog) through the Console. +* For Snowplow CDI, you can [request setup](https://console.snowplowanalytics.com/destinations/catalog) through Console +* For Snowplow Self-Hosted, see [Snowbridge](/docs/api-reference/snowbridge/index.md) :::note -Destinations Hub is the recommended way to setup GTM Server-side because it allows you to take full advantage of the Snowplow pipeline, and forward validated and enriched data to downstream destinations. +Destinations Hub is the recommended way to set up GTM Server-Side because it allows you to take full advantage of the Snowplow pipeline, and forward validated and enriched data to downstream destinations. ::: @@ -39,7 +39,7 @@ Use GTM SS to relay raw events before the Snowplow pipeline to destinations, inc ### Principles for AWS deployment -:::info For Snowplow BDP customers +:::info For Snowplow customers GTM SS **should** be deployed into a different account to the Snowplow sub-account to maintain full segmentation of the infrastructure that Snowplow manages from that which is managed by the Snowplow customer. 
diff --git a/docs/destinations/forwarding-events/google-tag-manager-server-side/testing/index.md b/docs/destinations/forwarding-events/google-tag-manager-server-side/testing/index.md index c041537e0..0ef653d52 100644 --- a/docs/destinations/forwarding-events/google-tag-manager-server-side/testing/index.md +++ b/docs/destinations/forwarding-events/google-tag-manager-server-side/testing/index.md @@ -10,7 +10,7 @@ You can direct some (or all) of your Snowplow events to the Preview Mode, instea :::note Snowbridge 2.3+ -To follow the steps below, you will need to be running [Snowbridge](/docs/api-reference/snowbridge/index.md) 2.3+. You will also need to have the [`spGtmssPreview` transformation](/docs/api-reference/snowbridge/configuration/transformations/builtin/spGtmssPreview.md) activated (this is the default for Snowplow BDP customers using Snowbridge with GTM Server Side). +To follow the steps below, you will need to be running [Snowbridge](/docs/api-reference/snowbridge/index.md) 2.3+. You will also need to have the [`spGtmssPreview` transformation](/docs/api-reference/snowbridge/configuration/transformations/builtin/spGtmssPreview.md) activated (this is the default for Snowplow customers using Snowbridge with GTM Server Side). ::: diff --git a/docs/destinations/forwarding-events/images/event-forwarding-diagram.drawio.svg b/docs/destinations/forwarding-events/images/event-forwarding-diagram.drawio.svg index 6906d5393..0491abf48 100644 --- a/docs/destinations/forwarding-events/images/event-forwarding-diagram.drawio.svg +++ b/docs/destinations/forwarding-events/images/event-forwarding-diagram.drawio.svg @@ -1,4 +1,4 @@ -
[SVG diagram diff: the event forwarding architecture image. Its labels cover Sources (Web, Mobile App, Server-side), the Pipeline (Collect, Validate, Enrich), Event Forwarders, Warehouse Loaders, Data Warehouse, Streaming Infra, and Destinations (Marketing Automation, Product Analytics, Advertising Platforms); the only text change in this diff is the title label "Snowplow BDP" becoming "Snowplow CDI".]
\ No newline at end of file diff --git a/docs/destinations/forwarding-events/reference/index.md b/docs/destinations/forwarding-events/reference/index.md index 6f00e5b22..bdaf2674c 100644 --- a/docs/destinations/forwarding-events/reference/index.md +++ b/docs/destinations/forwarding-events/reference/index.md @@ -4,7 +4,7 @@ description: "Complete reference for Snowplow Event Forwarding JavaScript expres sidebar_position: 20 --- -Event forwarders use JavaScript expressions for filtering events and mapping Snowplow data to destination fields. These expressions are entered during [forwarder setup](/docs/destinations/forwarding-events/index.md#getting-started) in BDP Console, specifically in the **Event filtering**, **Field mapping**, and **Custom functions** sections. This reference covers the syntax and available data for these operations. +Event forwarders use JavaScript expressions for filtering events and mapping Snowplow data to destination fields. These expressions are entered during [forwarder setup](/docs/destinations/forwarding-events/index.md#getting-started) in Console, specifically in the **Event filtering**, **Field mapping**, and **Custom functions** sections. This reference covers the syntax and available data for these operations. ## Available event fields diff --git a/docs/fundamentals/failed-events/index.md b/docs/fundamentals/failed-events/index.md index 63c737995..7a02b9f23 100644 --- a/docs/fundamentals/failed-events/index.md +++ b/docs/fundamentals/failed-events/index.md @@ -39,7 +39,7 @@ Other failures generally fall into 3 categories: * **Bots or malicious activity**. Bots, vulnerability scans, and so on, can send completely invalid events to the Collector. The format might be wrong, or the payload size might be extraordinarily large. -* **Pipeline misconfiguration**. For example, a loader could be reading from the wrong stream (with events in the wrong format). This is quite rare, especially for Snowplow BDP, where all relevant pipeline configuration is automatic. +* **Pipeline misconfiguration**. For example, a loader could be reading from the wrong stream (with events in the wrong format). This is quite rare, unless you're self-hosting Snowplow, as all relevant pipeline configuration is automatic. * **Temporary infrastructure issue**. This is again rare. One example would be Iglu Server (schema registry) not being available. @@ -51,7 +51,7 @@ All of these are internal failures you typically can’t address upstream. ## Dealing with failed events -Snowplow BDP provides a dashboard and alerts for failed events. See [Monitoring failed events](/docs/data-product-studio/data-quality/failed-events/monitoring-failed-events/index.md). +Snowplow provides a dashboard and alerts for failed events. See [Monitoring failed events](/docs/data-product-studio/data-quality/failed-events/monitoring-failed-events/index.md). --- diff --git a/docs/fundamentals/schemas/index.md b/docs/fundamentals/schemas/index.md index a5e084f2c..e76aab057 100644 --- a/docs/fundamentals/schemas/index.md +++ b/docs/fundamentals/schemas/index.md @@ -9,7 +9,7 @@ description: "Schemas are a powerful feature that ensures your data is clean and :::info Terminology -We often use the terms “schema” and “data structure” interchangeably, although in [Snowplow BDP](/docs/get-started/feature-comparison/index.md), “data structure” usually refers to a combination of a schema with some additional metadata. 
+We often use the terms “schema” and “data structure” interchangeably, although “data structure” usually refers to a combination of a schema with some additional metadata. ::: @@ -48,12 +48,12 @@ We use the terms “schema repository” and “schema registry” interchangeab There is a [central Iglu repository](http://iglucentral.com/) that holds public schemas for use with Snowplow, including ones for some of the [out-of-the-box self-described events](/docs/fundamentals/events/index.md#out-of-the-box-and-custom-events) and [out-of-the-box entities](/docs/fundamentals/entities/index.md#out-of-the-box-entities). -To host schemas for your [custom self-described events](/docs/fundamentals/events/index.md#self-describing-events) and [custom entities](/docs/fundamentals/entities/index.md#custom-entities), you can run your own Iglu, either using [Iglu Server](/docs/api-reference/iglu/iglu-repositories/iglu-server/index.md) (recommended), or a [static repository](/docs/api-reference/iglu/iglu-repositories/static-repo/index.md). _(This is not required for Snowplow BDP customers, as Snowplow BDP already includes a private Iglu repository.)_ +To host schemas for your [custom self-described events](/docs/fundamentals/events/index.md#self-describing-events) and [custom entities](/docs/fundamentals/entities/index.md#custom-entities), you can run your own Iglu, either using [Iglu Server](/docs/api-reference/iglu/iglu-repositories/iglu-server/index.md) (recommended), or a [static repository](/docs/api-reference/iglu/iglu-repositories/static-repo/index.md). _(This is not required for Snowplow CDI customers, as Snowplow already includes a private Iglu repository.)_ ## The anatomy of a schema :::info -BDP customers can create custom schemas using the [Data Structures Builder](/docs/data-product-studio/data-structures/manage/builder/index.md) without worrying about how it works under the hood. +Snowplow CDI customers can create custom schemas using the [Data Structures Builder](/docs/data-product-studio/data-structures/manage/builder/index.md) without worrying about how it works under the hood. ::: Snowplow schemas are based on the [JSON Schema](https://json-schema.org/) standard ([draft 4](https://datatracker.ietf.org/doc/html/draft-fge-json-schema-validation-00)). For a comprehensive guide to all Snowplow supported validation options, see the [Snowplow JSON Schema reference](/docs/fundamentals/schemas/json-schema-reference/index.md). Let’s take a look at an example schema to talk about its constituent parts: diff --git a/docs/get-started/feature-comparison/index.md b/docs/get-started/feature-comparison/index.md index ec0d4f6e1..81c75fbf0 100644 --- a/docs/get-started/feature-comparison/index.md +++ b/docs/get-started/feature-comparison/index.md @@ -5,73 +5,73 @@ hide_table_of_contents: true sidebar_label: "Feature comparison" --- -Here is a detailed list of product features, including which are available as part of the Snowplow Behavioral Data Platform (BDP) or [Snowplow Community Edition](/docs/get-started/snowplow-community-edition/index.md). +Here is a detailed list of product features, showing which are available as part of Snowplow [Customer Data Infrastructure](/docs/get-started/index.md#customer-data-infrastructure) (CDI) or [Snowplow Self-Hosted](/docs/get-started/index.md#self-hosted). -|

Data Pipeline

| BDP | [Community Edition](/docs/get-started/snowplow-community-edition/index.md) | -| :------------------------------------------------------------------------------------------------------------------------ | :---: | :------------------------------------------------------------------------: | -| [35+ trackers and webhooks](/docs/sources/index.md) | ✅ | ✅ | -| First party tracking | ✅ | ✅ | -| Anonymous data collection | ✅ | ✅ | -| [Cookie Extension service](/docs/events/cookie-extension/index.md) | ✅ | ✅ | -| High availability and auto-scaling | ✅ | ❌ | -| [Enrichments](/docs/pipeline/enrichments/available-enrichments/index.md) | ✅ | ✅ | -| [Failed events](/docs/fundamentals/failed-events/index.md) | ✅ | ✅ | -| [Data quality monitoring](/docs/data-product-studio/data-quality/failed-events/monitoring-failed-events/index.md) | ✅ | ❌ | -| Single Sign-On | ✅ | ❌ | -| Pipeline observability | ✅ | do-it-yourself | -| Surge protection | ✅ | do-it-yourself | +|

Data Pipeline

| CDI | Self-Hosted | +| :------------------------------------------------------------------------------------------------------------------------ | :---: | :--------------: | +| [25+ trackers and webhooks](/docs/sources/index.md) | ✅ | ✅ | +| First party tracking | ✅ | ✅ | +| Anonymous data collection | ✅ | ✅ | +| [Cookie Extension service](/docs/events/cookie-extension/index.md) | ✅ | ✅ | +| High availability and auto-scaling | ✅ | ❌ | +| [Enrichments](/docs/pipeline/enrichments/available-enrichments/index.md) | ✅ | ✅ | +| [Failed events](/docs/fundamentals/failed-events/index.md) | ✅ | ✅ | +| [Data quality monitoring](/docs/data-product-studio/data-quality/failed-events/monitoring-failed-events/index.md) | ✅ | ❌ | +| Single Sign-On | ✅ | ❌ | +| Pipeline observability | ✅ | do-it-yourself | +| Surge protection | ✅ | do-it-yourself | | **Warehouse / lake destinations** | | -| • Snowflake | ✅ | ✅ | -| • Redshift | ✅ | ✅ | -| • BigQuery | ✅ | ✅ | -| • Databricks | ✅ | ✅ | -| • Synapse Analytics | ✅ | ✅ | -| • Elasticsearch | ✅ | ✅ | -| • S3 | ✅ | ✅ | -| • Google Cloud Storage | ✅ | ✅ | -| • Azure Data Lake Storage / OneLake | ✅ | ✅ | +| • Snowflake | ✅ | ✅ | +| • Redshift | ✅ | ✅ | +| • BigQuery | ✅ | ✅ | +| • Databricks | ✅ | ✅ | +| • Synapse Analytics | ✅ | ✅ | +| • Elasticsearch | ✅ | ✅ | +| • S3 | ✅ | ✅ | +| • Google Cloud Storage | ✅ | ✅ | +| • Azure Data Lake Storage / OneLake | ✅ | ✅ | | **Real-time streams** | | -| • Kinesis | ✅ | ✅ | -| • PubSub | ✅ | ✅ | -| • Kafka / Azure Event Hubs / Confluent Cloud | ✅ | ✅ | -|

Data Product Studio

| BDP | [Community Edition](/docs/get-started/snowplow-community-edition/index.md) | -| Advanced enrichments (PII, IP anonymization, JS, API, SQL enrichments) | ✅ | ✅ (no UI or API) | -| [Data structures tooling and API](/docs/data-product-studio/data-structures/manage/index.md) | ✅ | ❌ | -| [Data products](/docs/data-product-studio/data-products/index.md) | ✅ | ❌ | -| [Data modeling management tooling](/docs/modeling-your-data/running-data-models-via-snowplow-bdp/dbt/index.md) | ✅ | ❌ | -| Jobs monitoring dashboard | ✅ | ❌ | -| Failed events alerting | ✅ | ❌ | -| Failed events in the warehouse | ✅ | ❌ | -| QA pipeline | ✅ | do-it-yourself | -| Fine-grained user permissions using access control lists | ✅ | ❌ | -| API key access | ✅ | ❌ | -|

[Data Model Packs](/docs/modeling-your-data/visualization/index.md)

| BDP | [Community Edition](/docs/get-started/snowplow-community-edition/index.md) | -| **Digital Analytics** | | | -| Funnel builder | ✅ | ❌ | -| User and Marketing Analytics | ✅ | ❌ | -| Marketing Attribution | ✅ | ❌ | -| Video and Media | ✅ | ❌ | -| [Unified Digital, Attribution, Media Player, Ecommerce, Normalize, Utils data models](/docs/modeling-your-data/index.md) | ✅ | ❌ | -| **Ecommerce Analytics** | | | -| Ecommerce Analytics | ✅ | ❌ | -| [Ecommerce data model](/docs/modeling-your-data/modeling-your-data-with-dbt/dbt-models/dbt-ecommerce-data-model/index.md) | ✅ | ❌ | -|

Extensions

| BDP | [Community Edition](/docs/get-started/snowplow-community-edition/index.md) | -| Reverse ETL, powered by Census | ✅ | ❌ | -| Audience Hub, powered by Census | ✅ | ❌ | -|

Performance and Resilience

| BDP | [Community Edition](/docs/get-started/snowplow-community-edition/index.md) | -| Outage Protection | ✅ | ❌ | -| Global Availability | ✅ | ❌ | -|

Infrastructure and Security

| BDP | [Community Edition](/docs/get-started/snowplow-community-edition/index.md) | -| **High** | | | -| HTTP access controls | ✅ | ❌ | -| VPC peering | ✅ | ❌ | -| SSH access control | ✅ | ❌ | -| CVE reporting | ✅ | ❌ | -| Static collector IPs | ✅ | ❌ | -| **Advanced** | | | -| Custom VPC integration | ✅ | ❌ | -| Custom IAM policy | ✅ | ❌ | -| Custom security agents | ✅ | ❌ | -|

SLAs

| BDP | [Community Edition](/docs/get-started/snowplow-community-edition/index.md) | -| Collector uptime SLA | ✅ | ❌ | -| Warehouse loading latency SLA | ✅ | ❌ | +| • Kinesis | ✅ | ✅ | +| • PubSub | ✅ | ✅ | +| • Kafka / Azure Event Hubs / Confluent Cloud | ✅ | ✅ | +|

Data Product Studio

| CDI | Self-Hosted | +| Advanced enrichments (PII, IP anonymization, JS, API, SQL enrichments) | ✅ | ✅ (no UI or API) | +| [Data structures tooling and API](/docs/data-product-studio/data-structures/manage/index.md) | ✅ | ❌ | +| [Data products](/docs/data-product-studio/data-products/index.md) | ✅ | ❌ | +| [Data modeling management tooling](/docs/modeling-your-data/running-data-models-via-console/dbt/index.md) | ✅ | ❌ | +| Jobs monitoring dashboard | ✅ | ❌ | +| Failed events alerting | ✅ | ❌ | +| Failed events in the warehouse | ✅ | ❌ | +| QA pipeline | ✅ | do-it-yourself | +| Fine-grained user permissions using access control lists | ✅ | ❌ | +| API key access | ✅ | ❌ | +|

[Data Model Packs](/docs/modeling-your-data/visualization/index.md)

| CDI | Self-Hosted | +| **Digital Analytics** | | | +| Funnel builder | ✅ | ❌ | +| User and Marketing Analytics | ✅ | ❌ | +| Marketing Attribution | ✅ | ❌ | +| Video and Media | ✅ | ❌ | +| [Unified Digital, Attribution, Media Player, Ecommerce, Normalize, Utils data models](/docs/modeling-your-data/index.md) | ✅ | ❌ | +| **Ecommerce Analytics** | | | +| Ecommerce Analytics | ✅ | ❌ | +| [Ecommerce data model](/docs/modeling-your-data/modeling-your-data-with-dbt/dbt-models/dbt-ecommerce-data-model/index.md) | ✅ | ❌ | +|

Extensions

| CDI | Self-Hosted | +| Reverse ETL, powered by Census | ✅ | ❌ | +| Audience Hub, powered by Census | ✅ | ❌ | +|

Performance and Resilience

| CDI | Self-Hosted | +| Outage Protection | ✅ | ❌ | +| Global Availability | ✅ | ❌ | +|

Infrastructure and Security

| CDI | Self-Hosted | +| **High** | | | +| HTTP access controls | ✅ | ❌ | +| VPC peering | ✅ | ❌ | +| SSH access control | ✅ | ❌ | +| CVE reporting | ✅ | ❌ | +| Static collector IPs | ✅ | ❌ | +| **Advanced** | | | +| Custom VPC integration | ✅ | ❌ | +| Custom IAM policy | ✅ | ❌ | +| Custom security agents | ✅ | ❌ | +|

SLAs

| CDI | Self-Hosted | +| Collector uptime SLA | ✅ | ❌ | +| Warehouse loading latency SLA | ✅ | ❌ | diff --git a/docs/get-started/index.md b/docs/get-started/index.md index 3ac0974e9..2037e5602 100644 --- a/docs/get-started/index.md +++ b/docs/get-started/index.md @@ -5,13 +5,25 @@ sidebar_label: "Get started" description: "Details on where and how Snowplow is deployed" --- -For production use, you can choose between Snowplow BDP Enterprise and Snowplow BDP Cloud. For non-production use cases, use Snowplow Community Edition, or play around with Snowplow Micro. See the [feature comparison page](/docs/get-started/feature-comparison/index.md) for more information. +Choose the [Snowplow](https://snowplow.io) platform that works for your business. See the [feature comparison page](/docs/get-started/feature-comparison/index.md) for more information. -## BDP Enterprise +We offer two fully featured Customer Data Infrastructure (CDI) platforms: +* **Snowplow CDI Private Managed Cloud**: hosted in your own cloud, managed by Snowplow +* **Snowplow CDI Cloud**: hosted and managed by Snowplow -Snowplow [BDP Enterprise](/docs/get-started/snowplow-bdp/index.md) is deployed using a "private SaaS" or "Bring Your Own Cloud (BYOC)" deployment model. This means the data pipeline is hosted and run in your own cloud environment, using your data warehouse or lake. These comprise the **data plane**. Ongoing pipeline maintenance, such as upgrades and security patches, are managed by Snowplow. +For self-hosted deployments, we have: +* **Snowplow Community Edition**: not for production use, hosted and managed by you +* **Snowplow Self-Hosted**: for production use, for existing Snowplow users -The **control plane**, which includes a UI and an API for [defining your data](/docs/data-product-studio/data-products/index.md) and managing your infrastructure, is hosted by Snowplow. +## Customer Data Infrastructure + +Snowplow CDI is our full infrastructure offering. Choose whether you'd like the **data plane** to be entirely hosted in your cloud account, or whether you'd prefer Snowplow to host the pipeline infrastructure for you. + +The **control plane**, which includes a UI and an API for [defining your data](/docs/data-product-studio/data-products/index.md) and managing your infrastructure, is always hosted by Snowplow. + +### CDI Private Managed Cloud + +Private Managed Cloud is a version of [Snowplow](https://snowplow.io) hosted in your own cloud account, using your data warehouse or lake. Ongoing pipeline maintenance, such as upgrades and security patches, are managed by Snowplow. | | Hosted by Snowplow | Hosted by you | | :------------------------------------------ | :----------------: | :-----------: | @@ -22,9 +34,11 @@ The **control plane**, which includes a UI and an API for [defining your data](/ | Pipeline infrastructure (AWS / Azure / GCP) | | ✅ | | Data destination (warehouse / lake) | | ✅ | -## BDP Cloud +### CDI Cloud + +Cloud is a hosted version of Snowplow designed to get your organization up and running and delivering value from behavioral data as quickly as possible. With Cloud, you don't need to set up any cloud infrastructure yourself. -Snowplow [BDP Cloud](/docs/get-started/snowplow-bdp/index.md) differs from BDP Enterprise in that your data pipeline is deployed in Snowplow's cloud account, and is entirely managed by Snowplow. +All data processed and collected with Snowplow Cloud is undertaken within Snowplow's own cloud account. 
| | Hosted by Snowplow | Hosted by you | | :------------------------------------------ | :----------------: | :-----------: | @@ -35,9 +49,15 @@ Snowplow [BDP Cloud](/docs/get-started/snowplow-bdp/index.md) differs from BDP E | Pipeline infrastructure (AWS / Azure / GCP) | ✅ | | | Data destination (warehouse / lake) | | ✅ | -## Community Edition +## Self-Hosted -With Snowplow [Community Edition](/docs/get-started/snowplow-community-edition/index.md), you deploy and host everything. Note that the **control plane** functionality isn't available in Community Edition. +With Self-Hosted, you deploy and host everything. Many features, including the **control plane**, are not available in Self-Hosted. + +### Community Edition + +Snowplow [Community Edition](/docs/get-started/self-hosted/index.md) is for **non-production** use cases. It's a starter template: use it to evaluate Snowplow for testing purposes. + +Community Edition infrastructure is provided under the [SLULA license](/docs/resources/copyright-license/index.md). | | Hosted by Snowplow | Hosted by you | | :------------------------------------------ | :----------------: | :-----------: | @@ -48,6 +68,15 @@ With Snowplow [Community Edition](/docs/get-started/snowplow-community-edition/i | Pipeline infrastructure (AWS / Azure / GCP) | | ✅ | | Data destination (warehouse / lake) | | ✅ | -## Micro +### Production Self-Hosted license + +If you have an existing Snowplow implementation, either Community Edition or a legacy deployment, you're eligible for Snowplow Self-Hosted. It's a production-use license. -While not a full substitute for a Snowplow pipeline, [Snowplow Micro](/docs/data-product-studio/data-quality/snowplow-micro/index.md) could be a quick way to get a feel for how Snowplow works for more technical users. Micro doesn't store data in any warehouse or database, but you will be able to look at the available fields. +| | Hosted by Snowplow | Hosted by you | +| :------------------------------------------ | :----------------: | :-----------: | +| **Control plane** | | | +| Management Console | N/A | N/A | +| API endpoints | N/A | N/A | +| **Data plane** | | | +| Pipeline infrastructure (AWS / Azure / GCP) | | ✅ | +| Data destination (warehouse / lake) | | ✅ | diff --git a/docs/get-started/modeling/index.md b/docs/get-started/modeling/index.md index 3fa63195f..b47238872 100644 --- a/docs/get-started/modeling/index.md +++ b/docs/get-started/modeling/index.md @@ -10,35 +10,14 @@ import Tabs from '@theme/Tabs'; import TabItem from '@theme/TabItem'; ``` -Querying the events table directly — as you would have done in the previous step — can be useful for exploring your events or building custom analytics. However, for many common use cases it’s much easier to use our [data models](/docs/modeling-your-data/modeling-your-data-with-dbt/index.md), which provide a pre-aggregated view of your data. +Querying the events table directly — as you would have done in the previous step — can be useful for exploring your events or building custom analytics. However, for many common use cases it's much easier to use our [data models](/docs/modeling-your-data/modeling-your-data-with-dbt/index.md), which provide a pre-aggregated view of your data. -We recommend [dbt](https://www.getdbt.com/) for data modeling. Here’s how to get started. +We recommend [dbt](https://www.getdbt.com/) for data modeling. 
Refer to the [setup instructions](/docs/modeling-your-data/running-data-models-via-console/index.md) to add and configure your models in [Snowplow Console](https://console.snowplowanalytics.com), so that they can be run automatically by Snowplow. - - +The [Unified Digital model](/docs/modeling-your-data/modeling-your-data-with-dbt/dbt-models/dbt-unified-data-model/index.md) is a good starting point for websites and/or mobile applications. It provides data about page and screen views, sessions, users, and more. You can also explore the [full list of available models](/docs/modeling-your-data/modeling-your-data-with-dbt/index.md). -Refer to the [setup instructions](/docs/modeling-your-data/running-data-models-via-snowplow-bdp/index.md) to add and configure your models in the Console, so that they can be run automatically by Snowplow BDP. - - - - -Refer to the [setup instructions](/docs/modeling-your-data/running-data-models-via-snowplow-bdp/index.md) to add and configure your models in the Console, so that they can be run automatically by Snowplow BDP. - - - - -You will need to install dbt and run the models yourself — see the “quick start” links below. - - - - -Next, add your first model: -* The [Unified Digital model](/docs/modeling-your-data/modeling-your-data-with-dbt/dbt-models/dbt-unified-data-model/index.md) is a good starting point for websites and/or mobile applications, providing data about page and screen views, sessions, users, and more ([quick start guide](/docs/modeling-your-data/modeling-your-data-with-dbt/dbt-quickstart/unified/index.md)) - -You can also explore the [full list of available models](/docs/modeling-your-data/modeling-your-data-with-dbt/index.md). +If you don't have access to Snowplow Console, you'll need to install dbt and run the models yourself. Check out our [Unified Digital Quick Start tutorial](/tutorials/unified-digital/intro) for help getting started. :::tip Using dbt - To start using our models with dbt, you will need to [create a dbt project](https://docs.getdbt.com/reference/commands/init) and [add the respective packages](https://docs.getdbt.com/docs/build/packages). - ::: diff --git a/docs/get-started/private-managed-cloud/index.md b/docs/get-started/private-managed-cloud/index.md new file mode 100644 index 000000000..2435c4cb7 --- /dev/null +++ b/docs/get-started/private-managed-cloud/index.md @@ -0,0 +1,20 @@ +--- +title: "Setting up Snowplow CDI" +sidebar_position: 3 +sidebar_label: "Setting up Snowplow CDI" +--- + +To get started with Snowplow Customer Data Infrastructure, follow the **Getting Started** steps in [Snowplow Console](https://console.snowplowanalytics.com/getting-started). You will receive an account as part of your onboarding. + +## CDI Cloud + +If you have a Snowplow [CDI Cloud](/docs/get-started/index.md#cdi-cloud) account, we'll set up your infrastructure for you. Check out **Pipelines** in Console to see your new pipeline. + +## CDI Private Managed Cloud + +If you have a Snowplow [CDI Private Managed Cloud](/docs/get-started/index.md#cdi-private-managed-cloud) account, the Getting Started steps will guide you through setting up your cloud environment. 
You can also find the instructions here: +* [AWS Setup Guide](/docs/get-started/private-managed-cloud/setup-guide-aws/index.md) +* [Azure Setup Guide](/docs/get-started/private-managed-cloud/setup-guide-azure/index.md) +* [GCP Setup Guide](/docs/get-started/private-managed-cloud/setup-guide-gcp/index.md) + +Once you've set up your cloud environment, go to **Pipelines** in Console to request your new pipeline. diff --git a/docs/get-started/snowplow-bdp/setup-guide-aws/images/aws_logo_0.png b/docs/get-started/private-managed-cloud/setup-guide-aws/images/aws_logo_0.png similarity index 100% rename from docs/get-started/snowplow-bdp/setup-guide-aws/images/aws_logo_0.png rename to docs/get-started/private-managed-cloud/setup-guide-aws/images/aws_logo_0.png diff --git a/docs/get-started/snowplow-bdp/setup-guide-aws/index.md b/docs/get-started/private-managed-cloud/setup-guide-aws/index.md similarity index 90% rename from docs/get-started/snowplow-bdp/setup-guide-aws/index.md rename to docs/get-started/private-managed-cloud/setup-guide-aws/index.md index af8baeefb..b06b24550 100644 --- a/docs/get-started/snowplow-bdp/setup-guide-aws/index.md +++ b/docs/get-started/private-managed-cloud/setup-guide-aws/index.md @@ -1,11 +1,11 @@ --- -title: "BDP Enterprise on AWS" +title: "Private Managed Cloud on AWS" date: "2020-01-30" sidebar_position: 10 coverImage: "aws_logo_0.png" --- -To set up Snowplow, simply follow the ['Getting Started' steps in the Snowplow BDP Console](https://console.snowplowanalytics.com/getting-started). You will receive an account as part of your onboarding. +To set up Snowplow, follow the ['Getting Started' steps in the Snowplow Console](https://console.snowplowanalytics.com/getting-started). You will receive an account as part of your onboarding. ## What are the steps @@ -25,11 +25,11 @@ To set up your cloud environment as required you will need: - to know which AWS region you’d like us to install your Snowplow pipeline into - to know whether or not you will need to use an existing VPC or require VPC peering (note: VPC peering and using a custom VPC are additional bolt-ons) -We often find our point of contact requires support from their DevOps or Networking colleagues to complete the cloud setup step; in Snowplow BDP Console you can [easily create accounts for colleagues](/docs/account-management/managing-users/index.md) who can complete this step for you. +We often find our point of contact requires support from their DevOps or Networking colleagues to complete the cloud setup step; in Console you can [easily create accounts for colleagues](/docs/account-management/managing-users/index.md) who can complete this step for you. ## Preparing your AWS sub-account -These instructions are also provided as part of the setup flow in Snowplow BDP Console. +These instructions are also provided as part of the setup flow in Console. ### Create sub-account @@ -42,11 +42,11 @@ These instructions are also provided as part of the setup flow in Snowplow BDP C 1. Access the IAM control panel within the sub-account 2. Go to Access management > Roles and select Create role -3. Select "Another AWS account" (Account ID: 793733611312 Require MFA: false). We use Okta to assume roles, which uses delegated MFA and not direct MFA authentication to AWS +3. Select "Another AWS account" (Account ID: 793733611312 Require MFA: false). We use Okta to assume roles, which uses delegated MFA and not direct MFA authentication to AWS 4. Select the policy you created earlier 5. 
Call the role "SnowplowAdmin" (please use this specific name) -You will need to share this role with us as part of filling out the setup form in Snowplow BDP Console. +You will need to share this role with us as part of filling out the setup form in Console. ### JSON Policy Document @@ -140,9 +140,9 @@ The last step is to set up the Snowplow deployment role. This is a role assumed - Do not select Require MFA as Snowplow needs to be able to assume the role via headless jobs - If setting this up via IAM, do not add `"aws:MultiFactorAuthPresent": "false"` condition, as this will prevent the role being assumed by Snowplow SRE staff. We use Okta to assume roles, which uses delegated MFA and not direct MFA authentication to AWS 3. Attach the `IAMFullAccess` policy. If a Permission Boundary was set on the admin role, then add this boundary to the bottom section of permissions page. -- Role name: SnowplowDeployment (please use this specific name) -- Role description: Allows the Snowplow Team to programmatically deploy to this account. -4. Copy the Snowplow deployment role ARN. You will need to share this role with us as part of filling out the setup form in Snowplow BDP console. +- Role name: `SnowplowDeployment` (please use this specific name) +- Role description: allows the Snowplow Team to programmatically deploy to this account. +4. Copy the Snowplow deployment role ARN. You will need to share this role with us as part of filling out the setup form in Console. ### Provide a CIDR range for VPC peering or using a custom VPC (optional) diff --git a/docs/get-started/snowplow-bdp/setup-guide-azure/images/azure_role_assignment_conditions.png b/docs/get-started/private-managed-cloud/setup-guide-azure/images/azure_role_assignment_conditions.png similarity index 100% rename from docs/get-started/snowplow-bdp/setup-guide-azure/images/azure_role_assignment_conditions.png rename to docs/get-started/private-managed-cloud/setup-guide-azure/images/azure_role_assignment_conditions.png diff --git a/docs/get-started/snowplow-bdp/setup-guide-azure/index.md b/docs/get-started/private-managed-cloud/setup-guide-azure/index.md similarity index 93% rename from docs/get-started/snowplow-bdp/setup-guide-azure/index.md rename to docs/get-started/private-managed-cloud/setup-guide-azure/index.md index b529c0969..1c23e28e9 100644 --- a/docs/get-started/snowplow-bdp/setup-guide-azure/index.md +++ b/docs/get-started/private-managed-cloud/setup-guide-azure/index.md @@ -1,15 +1,13 @@ --- -title: "BDP Enterprise on Azure" +title: "Private Managed Cloud on Azure" sidebar_position: 30 --- -## Request your pipeline through Snowplow BDP Console - -Login to Snowplow BDP Console where you'll be able to follow a step-by-step guide to getting started (including the steps below). +To set up Snowplow, log in to Snowplow [Console](https://console.snowplowanalytics.com) where you'll be able to follow a step-by-step guide to getting started, including the steps below. ## Setting up your Azure account -To get your cloud environment ready for your Snowplow pipeline to be installed: +The following steps explain how to set up your cloud environment ready for your Snowplow pipeline to be installed. ### Create a new subscription @@ -23,7 +21,7 @@ Enable billing in the tenant by creating a subscription. 
Otherwise, the pipeline Snowplow deploys into your tenant using a verified [application service principal](https://learn.microsoft.com/en-us/entra/identity-platform/app-objects-and-service-principals?tabs=browser#service-principal-object) (Enterprise application). We require a custom role to be assigned to the application service principal. This will allow us to create custom pipeline roles needed for deploying and managing different components of your infrastructure. -#### Consent to Snowplow BDP Enterprise Deployment application +#### Consent to Snowplow Private Managed Cloud Deployment application You will need to grant our verified application service principal the access into your Azure tenant. Once that’s done, you should see the application service principal under _Microsoft Entra ID_ → _Enterprise Applications_. @@ -33,7 +31,7 @@ You will need to grant our verified application service principal the access int https://login.microsoftonline.com//oauth2/authorize?client_id=0581feb4-b614-42c7-b8e7-b4e7fba9153a&response_type=code ``` 3. A consent window will appear detailing that an Enterprise application is being set up in your tenant. It needs to be accepted by your Azure tenant admin for the organization (there is a tick box that must be ticked). After accepting, Microsoft redirects you to a page unrelated to the Azure Portal, so close this window -4. Verify the trust has been established by viewing “Snowplow BDP Enterprise Deployment” application in the Enterprise Applications section of Entra ID +4. Verify the trust has been established by viewing “Snowplow Private Managed Cloud Deployment” application in the Enterprise Applications section of Entra ID #### Create and assign role to application service principal diff --git a/docs/get-started/snowplow-bdp/setup-guide-gcp/images/gcp_logo.png b/docs/get-started/private-managed-cloud/setup-guide-gcp/images/gcp_logo.png similarity index 100% rename from docs/get-started/snowplow-bdp/setup-guide-gcp/images/gcp_logo.png rename to docs/get-started/private-managed-cloud/setup-guide-gcp/images/gcp_logo.png diff --git a/docs/get-started/snowplow-bdp/setup-guide-gcp/index.md b/docs/get-started/private-managed-cloud/setup-guide-gcp/index.md similarity index 89% rename from docs/get-started/snowplow-bdp/setup-guide-gcp/index.md rename to docs/get-started/private-managed-cloud/setup-guide-gcp/index.md index 8866c8ae9..b796825ef 100644 --- a/docs/get-started/snowplow-bdp/setup-guide-gcp/index.md +++ b/docs/get-started/private-managed-cloud/setup-guide-gcp/index.md @@ -1,17 +1,15 @@ --- -title: "BDP Enterprise on GCP" +title: "Private Managed Cloud on GCP" date: "2020-01-30" sidebar_position: 30 coverImage: "gcp_logo.png" --- -## Request your pipeline through Snowplow BDP Console - -Login to Snowplow BDP Console where you'll be able to follow a step-by-step guide to getting started (including the steps below). +To set up Snowplow, log in to Snowplow [Console](https://console.snowplowanalytics.com) where you'll be able to follow a step-by-step guide to getting started, including the steps below. ## Setting up your Google project -To set up your cloud environment ready for your Snowplow pipeline to be installed: +The following steps explain how to set up your cloud environment ready for your Snowplow pipeline to be installed. 
### Create a new project diff --git a/docs/get-started/querying/index.md b/docs/get-started/querying/index.md index e45c67d01..fa30c2759 100644 --- a/docs/get-started/querying/index.md +++ b/docs/get-started/querying/index.md @@ -10,22 +10,15 @@ import Tabs from '@theme/Tabs'; import TabItem from '@theme/TabItem'; ``` -Once you’ve tracked some events, you will want to look at them in your data warehouse or database. The exact steps will depend on your choice of storage and the Snowplow offering. +Once you've tracked some events, you'll want to look at them in your data warehouse, database, or lake. The exact steps will depend on your choice of storage and your Snowplow platform. -## Connection details +For **Snowplow CDI** Private Managed Cloud or Cloud customers, you can find your connection details in [Snowplow Console](https://console.snowplowanalytics.com/destinations/catalog), under the destination you've selected. - - +Follow [our querying guide](/docs/destinations/warehouses-lakes/querying-data/index.md) for advice on querying your data. -You can find the connection details in the [Console](https://console.snowplowanalytics.com/destinations/catalog), under the destination you’ve selected. +## Snowplow Self-Hosted - - - -You can find the connection details in the [Console](https://console.snowplowanalytics.com/destinations/catalog), under the destination you’ve selected. - - - +If you don't have access to Snowplow Console, follow these instructions to connect to your Snowplow data: @@ -85,7 +78,7 @@ To connect, you can use either Snowflake dashboard or [SnowSQL](https://docs.sno :::info Azure-specific instructions -On Azure, you have created an external table in the [last step of the guide](/docs/get-started/snowplow-community-edition/quick-start/index.md#configure-the-destination). Use this table and ignore the text below. +On Azure, you have created an external table in the [last step of the guide](/docs/get-started/self-hosted/quick-start/index.md#configure-the-destination). Use this table and ignore the text below. ::: @@ -102,20 +95,13 @@ See the [Databricks tutorial](https://docs.databricks.com/getting-started/quick- In Synapse Analytics, you can connect directly to the data residing in ADLS. You will need to know the names of the storage account (set in the `storage_account_name` Terraform variable) and the storage container (it’s a fixed value: `lake-container`). -Follow [the Synapse documentation](https://learn.microsoft.com/en-us/azure/synapse-analytics/sql/query-delta-lake-format) and use the `OPENROWSET` function. If you created a data source in the [last step](/docs/get-started/snowplow-community-edition/quick-start/index.md#configure-the-destination) of the quick start guide, your queries will be a bit simpler. +Follow [the Synapse documentation](https://learn.microsoft.com/en-us/azure/synapse-analytics/sql/query-delta-lake-format) and use the `OPENROWSET` function. If you created a data source in the [last step](/docs/get-started/self-hosted/quick-start/index.md#configure-the-destination) of the quick start guide, your queries will be a bit simpler. :::tip Fabric and OneLake -If you created a OneLake shortcut in the [last step](/docs/get-started/snowplow-community-edition/quick-start/index.md#configure-the-destination) of the quick start guide, you will be able to explore Snowplow data in Fabric, for example, using Spark SQL. 
+If you created a OneLake shortcut in the [last step](/docs/get-started/self-hosted/quick-start/index.md#configure-the-destination) of the quick start guide, you will be able to explore Snowplow data in Fabric, for example, using Spark SQL. ::: - - - - -## Writing queries - -Follow [our querying guide](/docs/destinations/warehouses-lakes/querying-data/index.md) for more information. diff --git a/docs/get-started/snowplow-community-edition/_diagram.md b/docs/get-started/self-hosted/_diagram.md similarity index 100% rename from docs/get-started/snowplow-community-edition/_diagram.md rename to docs/get-started/self-hosted/_diagram.md diff --git a/docs/get-started/snowplow-community-edition/_license-notice.md b/docs/get-started/self-hosted/_license-notice.md similarity index 100% rename from docs/get-started/snowplow-community-edition/_license-notice.md rename to docs/get-started/self-hosted/_license-notice.md diff --git a/docs/get-started/snowplow-community-edition/faq/index.md b/docs/get-started/self-hosted/faq/index.md similarity index 98% rename from docs/get-started/snowplow-community-edition/faq/index.md rename to docs/get-started/self-hosted/faq/index.md index bb955b7e2..b2ad3d4d1 100644 --- a/docs/get-started/snowplow-community-edition/faq/index.md +++ b/docs/get-started/self-hosted/faq/index.md @@ -1,5 +1,5 @@ --- -title: "Community Edition quick start: frequently asked questions" +title: "Snowplow Self-Hosted quick start: frequently asked questions" sidebar_label: "Frequently asked questions" sidebar_position: 5 --- diff --git a/docs/get-started/snowplow-community-edition/what-is-quick-start/index.md b/docs/get-started/self-hosted/index.md similarity index 77% rename from docs/get-started/snowplow-community-edition/what-is-quick-start/index.md rename to docs/get-started/self-hosted/index.md index bd8487435..da4061f55 100644 --- a/docs/get-started/snowplow-community-edition/what-is-quick-start/index.md +++ b/docs/get-started/self-hosted/index.md @@ -1,9 +1,16 @@ --- -title: "What to expect from the quick start guide" -sidebar_label: "Before you begin" -sidebar_position: 1 +title: "Setting up Snowplow Self-Hosted" +date: "2020-10-30" +sidebar_position: 4 +sidebar_label: "Setting up Self-Hosted" --- +```mdx-code-block +import LicenseNotice from '@site/docs/get-started/self-hosted/_license-notice.md'; +``` + +This page is an introduction for the [Self-Hosted](/docs/get-started/index.md#self-hosted) [Quick Start guide](docs/get-started/self-hosted/quick-start/index.md). Follow the Quick Start to set up a self-hosted deployment. + We have built a set of [terraform](https://www.terraform.io/docs/language/modules/develop/index.html) modules, which automates the setup and deployment of the required infrastructure and applications for an operational Snowplow Community Edition pipeline, with just a handful of input variables required on your side. After following this guide, you will be able to:  @@ -13,7 +20,7 @@ After following this guide, you will be able to:  - Easily enable and disable our suite of [out-of-the-box enrichments](/docs/pipeline/enrichments/available-enrichments/index.md) - Consume your rich data from the data warehouse, database, lake and/or real-time stream -Here’s some key information. + ## Required time @@ -36,5 +43,3 @@ Out of the box, the deployed pipeline will handle up to ~100 events per second ( ## Getting help Check out our [Community](https://community.snowplow.io/). If you run into any problems or have any questions, we are here to help. 
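To make the Terraform-based setup described above more concrete, here is a minimal sketch of the workflow the quick start walks you through. It assumes you have already copied the quick start configuration into a local directory (the directory name below is a placeholder) and filled in the required input variables; the quick start guide and its FAQ remain the authoritative steps, including troubleshooting for `terraform plan` and `terraform apply` errors.

```bash
# Run from the directory containing the quick start Terraform configuration.
# "snowplow-pipeline" is only a placeholder name for that directory.
cd snowplow-pipeline

terraform init    # downloads the snowplow-devops modules and providers
terraform plan    # shows the pipeline infrastructure that would be created
terraform apply   # provisions the pipeline components described in the quick start
```

The same flow applies when you later change input variables or upgrade module versions: `terraform plan` shows the delta before anything is modified.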
- -If you are interested in receiving the latest updates from Product & Engineering, such as critical bug fixes, security updates, new features and the rest, then [join our mailing list](https://info.snowplow.io/newsletter-signup). diff --git a/docs/get-started/snowplow-community-edition/quick-start/index.md b/docs/get-started/self-hosted/quick-start/index.md similarity index 98% rename from docs/get-started/snowplow-community-edition/quick-start/index.md rename to docs/get-started/self-hosted/quick-start/index.md index edf14c9cf..1e33620cc 100644 --- a/docs/get-started/snowplow-community-edition/quick-start/index.md +++ b/docs/get-started/self-hosted/quick-start/index.md @@ -6,7 +6,7 @@ sidebar_position: 2 ```mdx-code-block import Tabs from '@theme/Tabs'; import TabItem from '@theme/TabItem'; -import LicenseNotice from '@site/docs/get-started/snowplow-community-edition/_license-notice.md'; +import LicenseNotice from '@site/docs/get-started/self-hosted/_license-notice.md'; ``` This guide will take you through how to spin up a Snowplow Community Edition pipeline using the [Snowplow Terraform modules](https://registry.terraform.io/namespaces/snowplow-devops). _(Not familiar with Terraform? Take a look at [Infrastructure as code with Terraform](https://learn.hashicorp.com/tutorials/terraform/infrastructure-as-code?in=terraform/aws-get-started).)_ @@ -645,7 +645,7 @@ If you have attached a custom SSL certificate and set up your own DNS records, t :::tip Terraform errors -For solutions to some common Terraform errors that you might encounter when running `terraform plan` or `terraform apply`, see the [FAQs section](/docs/get-started/snowplow-community-edition/faq/index.md#troubleshooting-terraform-errors). +For solutions to some common Terraform errors that you might encounter when running `terraform plan` or `terraform apply`, see the [FAQs section](/docs/get-started/self-hosted/faq/index.md#troubleshooting-terraform-errors). ::: @@ -810,4 +810,4 @@ You should now be able to access your service over HTTPS. Verify this by going t --- -If you are curious, here’s [what has been deployed](/docs/get-started/snowplow-community-edition/what-is-deployed/index.md). Now it’s time to [send your first events to your pipeline](/docs/get-started/tracking/index.md)! +If you are curious, here’s [what has been deployed](/docs/get-started/self-hosted/what-is-deployed/index.md). Now it’s time to [send your first events to your pipeline](/docs/get-started/tracking/index.md)! diff --git a/docs/get-started/snowplow-community-edition/telemetry/index.md b/docs/get-started/self-hosted/telemetry/index.md similarity index 98% rename from docs/get-started/snowplow-community-edition/telemetry/index.md rename to docs/get-started/self-hosted/telemetry/index.md index a4b0caf25..6dd6a2b1e 100644 --- a/docs/get-started/snowplow-community-edition/telemetry/index.md +++ b/docs/get-started/self-hosted/telemetry/index.md @@ -45,7 +45,7 @@ We also appreciate if you provide your email (or just a UUID) in the `user_provi ## Which components have telemetry? At the moment, opt-out telemetry is present in the following: -* Terraform modules for the [quick start guide](/docs/get-started/snowplow-community-edition/quick-start/index.md). +* Terraform modules for the [quick start guide](/docs/get-started/self-hosted/quick-start/index.md). * [Collector](/docs/api-reference/stream-collector/setup/index.md). 
* Enrich ([Enrich Kinesis](/docs/api-reference/enrichment-components/enrich-kinesis/index.md), [Enrich PubSub](/docs/api-reference/enrichment-components/enrich-pubsub/index.md), [Enrich Kafka](/docs/api-reference/enrichment-components/enrich-kafka/index.md). * RDB Loader ([Transformer Kinesis](/docs/api-reference/loaders-storage-targets/snowplow-rdb-loader/transforming-enriched-data/stream-transformer/transformer-kinesis/index.md), [Transformer PubSub](/docs/api-reference/loaders-storage-targets/snowplow-rdb-loader/transforming-enriched-data/stream-transformer/transformer-pubsub/index.md), [Redshift Loader](/docs/api-reference/loaders-storage-targets/snowplow-rdb-loader/loading-transformed-data/redshift-loader/index.md), [Snowflake Loader](/docs/api-reference/loaders-storage-targets/snowplow-rdb-loader/loading-transformed-data/snowflake-loader/index.md), [Databricks Loader](/docs/api-reference/loaders-storage-targets/snowplow-rdb-loader/loading-transformed-data/databricks-loader/index.md)). diff --git a/docs/get-started/snowplow-community-edition/upgrade-guide/index.md b/docs/get-started/self-hosted/upgrade-guide/index.md similarity index 100% rename from docs/get-started/snowplow-community-edition/upgrade-guide/index.md rename to docs/get-started/self-hosted/upgrade-guide/index.md diff --git a/docs/get-started/snowplow-community-edition/what-is-deployed/index.md b/docs/get-started/self-hosted/what-is-deployed/index.md similarity index 99% rename from docs/get-started/snowplow-community-edition/what-is-deployed/index.md rename to docs/get-started/self-hosted/what-is-deployed/index.md index ac748c49b..df8f590d8 100644 --- a/docs/get-started/snowplow-community-edition/what-is-deployed/index.md +++ b/docs/get-started/self-hosted/what-is-deployed/index.md @@ -7,7 +7,7 @@ sidebar_position: 3 import Tabs from '@theme/Tabs'; import TabItem from '@theme/TabItem'; import Link from '@docusaurus/Link'; -import Diagram from '@site/docs/get-started/snowplow-community-edition/_diagram.md'; +import Diagram from '@site/docs/get-started/self-hosted/_diagram.md'; export const TerraformLinks = (props) =>

For further details on the resources, default and required input variables, and outputs, see the Terraform module ( diff --git a/docs/get-started/snowplow-bdp/index.md b/docs/get-started/snowplow-bdp/index.md deleted file mode 100644 index 41107ebef..000000000 --- a/docs/get-started/snowplow-bdp/index.md +++ /dev/null @@ -1,69 +0,0 @@ ---- -title: "Setting up Snowplow Behavioral Data Platform" -sidebar_position: 3 -sidebar_label: "Snowplow BDP" -sidebar_custom_props: - offerings: - - bdp ---- - -```mdx-code-block -import Tabs from '@theme/Tabs'; -import TabItem from '@theme/TabItem'; -``` - -## Enterprise: in your own cloud - -BDP Enterprise is a version of [Snowplow](https://snowplow.io) hosted in your own cloud account (AWS, Azure or GCP). It enables you to create rich and well-structured data to power your advanced analytics and AI use cases. - -:::tip BDP Cloud - -Consider [BDP Cloud](#cloud-hosted-by-snowplow) for a faster setup time if you don’t wish to host your own cloud infrastructure (besides the data warehouse or lake). See [deployment model](docs/get-started/index.md) for a comparison between BDP Enterprise and BDP Cloud. - -::: - - - - -Refer to the [AWS Setup Guide](/docs/get-started/snowplow-bdp/setup-guide-aws/index.md). Once you’ve set up your cloud environment, you can [request a new pipeline](https://console.snowplowanalytics.com/pipelines/AWS/new) via the Snowplow BDP Console. - - - - -Refer to the [Azure Setup Guide](/docs/get-started/snowplow-bdp/setup-guide-azure/index.md). Once you’ve set up your cloud environment, you can [request a new pipeline](https://console.snowplowanalytics.com/pipelines/azure/new) via the Snowplow BDP Console. - - - - -Refer to the [AWS Setup Guide](/docs/get-started/snowplow-bdp/setup-guide-gcp/index.md). Once you’ve set up your cloud environment, you can [request a new pipeline](https://console.snowplowanalytics.com/pipelines/gcp/new) via the Snowplow BDP Console. - - - - -### Where is the infrastructure installed? - -Snowplow’s internal team of data engineering experts set up, manage and upgrade your data pipeline within your own cloud environment bringing all the security, data quality and data activation benefits that come from owning your infrastructure. - -:::tip - -If you’d like to learn more about Snowplow BDP you can **[book a demo with our team](https://snowplow.io/get-started/book-a-demo-of-snowplow-bdp/?utm-medium=related-content&utm_campaign=snowplow-docs)**. - -::: - -## Cloud: hosted by Snowplow - -BDP Cloud is a hosted version of [Snowplow](https://snowplow.io) designed to get your organization up and running and delivering value from behavioral data as quickly as possible. It enables you to create rich and well-structured data to power your advanced analytics and AI use cases. - -With BDP Cloud, you don’t need to set up any cloud infrastructure yourself. See [deployment model](docs/get-started/index.md) for a comparison between BDP Cloud and BDP Enterprise. - -### Where is the Snowplow pipeline hosted? - -All data processed and collected with Snowplow BDP Cloud is undertaken within Snowplow’s own cloud account. Data is stored in Snowplow’s cloud account for 7 days to provide resilience against potential failures. - -For more information, please see our [full product description](https://snowplow.io/). - -:::tip - -If you’d like to learn more about Snowplow BDP you can **[book a demo with our team](https://snowplow.io/get-started/book-a-demo-of-snowplow-bdp/?utm-medium=related-content&utm_campaign=snowplow-docs)**. 
- -::: diff --git a/docs/get-started/snowplow-community-edition/index.md b/docs/get-started/snowplow-community-edition/index.md deleted file mode 100644 index d92ca308c..000000000 --- a/docs/get-started/snowplow-community-edition/index.md +++ /dev/null @@ -1,30 +0,0 @@ ---- -title: "Setting up Snowplow Community Edition" -date: "2020-10-30" -sidebar_position: 4 -sidebar_label: "Snowplow Community Edition" -sidebar_custom_props: - offerings: - - community ---- - -```mdx-code-block -import LicenseNotice from '@site/docs/get-started/snowplow-community-edition/_license-notice.md'; -``` - -Snowplow Community Edition is an event data collection platform for data teams who want to manage the collection and warehousing of data across all their platforms and channels, in real-time. - - - -Snowplow is a complete, loosely coupled platform that lets you capture, store and analyze granular customer-level and event-level data: - -- Drill down to individual customers and events -- Zoom out to compare behaviors between cohorts and over time -- Join web analytics data with other data sets (e.g. CRM, media catalogue, product catalogue, offline data) -- Segment your audience by behavior -- Develop recommendations and personalisations engines - -Snowplow has been technically designed to: - -- Give you access, ownership and control of your own web analytics data (no lock in) -- Be loosely coupled and extensible, so that it is easy to add e.g. new trackers to capture data from new platforms (e.g. mobile, TV) and put the data to new uses diff --git a/docs/get-started/tracking/index.md b/docs/get-started/tracking/index.md index e6fafff87..01d63e2e1 100644 --- a/docs/get-started/tracking/index.md +++ b/docs/get-started/tracking/index.md @@ -13,7 +13,7 @@ import EventComponent from '@site/src/components/FirstSteps'; import { sampleTrackingCode } from '@site/src/components/FirstSteps/sampleTrackingCode'; ``` -Once your pipeline is set up, you will want to send some events to it. Here’s an overview of the different options. +Once your pipeline is set up, you will want to send some events to it. Here's an overview of the different options. :::tip Latency @@ -33,19 +33,19 @@ This is because web browsers block traffic from HTTPS-enabled sites (such as `ht ::: - + -You can find the Collector URL (Collector Endpoint) in the [Console](https://console.snowplowanalytics.com/environments). +You can find the Collector URL (Collector endpoint) in [Console](https://console.snowplowanalytics.com/environments). - + -You can find the Collector URL (Collector Endpoint) in the [Console](https://console.snowplowanalytics.com/environments). +You can find the Collector URL (Collector endpoint) in [Console](https://console.snowplowanalytics.com/environments). - + -Input the Collector URL you’ve chosen when deploying your Community Edition pipeline. +Input the Collector URL you chose when deploying your Snowplow Self-Hosted pipeline. If you have not yet configured an SSL certificate and a custom domain name for your Collector, you can use `http://` (`http`, not `https`), where `collector_dns_name` is the output of the pipeline Terraform module. @@ -81,19 +81,19 @@ The [JavaScript tracker](/docs/sources/trackers/web-trackers/quick-start-guide/i To use the JavaScript tracker on your site, you will need to obtain a code snippet first. - + -BDP Enterprise can automatically generate the snippet for you. 
Go to the [tag generator](https://console.snowplowanalytics.com/tag-generator) screen, fill in the necessary parameters, and copy the snippet at the bottom. +CDI Private Managed Cloud can automatically generate the snippet for you. Go to the [tag generator](https://console.snowplowanalytics.com/tag-generator) screen, fill in the necessary parameters, and copy the snippet at the bottom. - + You can find the pre-generated snippet in the [Getting started](https://console.snowplowanalytics.com/environments/start-tracking-events?fromDocs) section. - + -Take note of the Collector URL you’ve chosen when deploying your Community Edition pipeline. +Take note of the Collector URL you’ve chosen when deploying your Snowplow Self-Hosted pipeline. If you have not yet configured an SSL certificate and a custom domain name for your Collector, you can use `http://` (`http`, not `https`), where `collector_dns_name` is the output of the pipeline Terraform module. diff --git a/docs/glossary/index.md b/docs/glossary/index.md index 1b78870da..e97ac0d1a 100644 --- a/docs/glossary/index.md +++ b/docs/glossary/index.md @@ -10,7 +10,7 @@ To help clarify, we've categorized the terms: * Component: a pipeline application, package, SDK, or other code-based tool or set of tools, e.g. Snowplow Micro * Concept: names for Snowplow things and ideas, e.g. data product * Legal: about licensing, e.g. SLULA -* Offering: a Snowplow product that you can buy, e.g. Snowplow BDP Cloud +* Offering: a Snowplow product that you can buy, e.g. Snowplow CDI Cloud | Term | Category | Description | More information | | -------------------------------------- | --------- | --------------------------------------------------------------------------------------------------------------------------------------------------------------------- | ------------------------------------------------------------------------------------------------------------------------------ | @@ -63,23 +63,25 @@ To help clarify, we've categorized the terms: | **Service** | Concept | Attribute consumption interface layer for Signals | [Services](/docs/signals/concepts/index.md#services) | | **Signals** | Offering | Real-time customer intelligence platform | [Signals](/docs/signals/index.md) | | **SLULA** | Legal | Snowplow Limited Use License Agreement, a non-production use license | [Copyright licenses](/docs/resources/copyright-license/index.md) | -| **Snowbridge** | Component | Application that sends data streams into external destinations | [Snowbridge](/docs/api-reference/snowbridge/index.md) | -| **Snowplow BDP** | Offering | Full Snowplow product offering | [Snowplow BDP](/docs/get-started/snowplow-bdp/index.md) | -| **Snowplow BDP Cloud** | Offering | Snowplow running in a cloud hosted by Snowplow | [Snowplow BDP](/docs/get-started/snowplow-bdp/index.md) | -| **Snowplow BDP Console** | Offering | Web UI for Snowplow data management | | -| **Snowplow BDP Enterprise** | Offering | Deprecated name for Snowplow BDP Private Managed Cloud | [Snowplow BDP](/docs/get-started/snowplow-bdp/index.md) | -| **Snowplow BDP Private Managed Cloud** | Offering | Snowplow running in your own cloud | [Snowplow BDP](/docs/get-started/snowplow-bdp/index.md) | +| **Snowbridge** | Component | Application that sends data streams into external destinations | [Snowbridge](/docs/api-reference/snowbridge/index.md) | +| **Snowplow BDP** | Offering | Deprecated name for Snowplow CDI | [Snowplow CDI](/docs/get-started/index.md#customer-data-infrastructure) | +| **Snowplow BDP Enterprise** | 
Offering | Deprecated name for Snowplow CDI Private Managed Cloud | [Snowplow CDI](/docs/get-started/index.md#customer-data-infrastructure) | +| **Snowplow CDI** | Offering | Full Snowplow product offering | [Snowplow CDI](/docs/get-started/index.md#customer-data-infrastructure) | +| **Snowplow CDI Cloud** | Offering | Snowplow running in a cloud hosted by Snowplow | [Snowplow CDI](/docs/get-started/index.md#customer-data-infrastructure) | +| **Snowplow CDI Private Managed Cloud** | Offering | Snowplow running in your own cloud | [Snowplow CDI](/docs/get-started/index.md#customer-data-infrastructure) | | **Snowplow CLI** | Component | Tool for command-line management of data products and data structures | [Snowplow CLI](/docs/data-product-studio/snowplow-cli/index.md) | -| **Snowplow Community Edition** | Offering | Limited license (SLULA) Snowplow product offering | [Snowplow Community Edition](/docs/get-started/snowplow-community-edition/index.md) | +| **Snowplow Community Edition** | Offering | Limited license (SLULA) Snowplow self-hosted product offering | [Snowplow Self-Hosted](/docs/get-started/index.md#self-hosted) | +| **Snowplow Console** | Offering | Web UI for Snowplow data management | [Snowplow Console](https://console.snowplowanalytics.com) | | **Snowplow Inspector** | Component | Browser extension for testing and validation of web tracking | [Snowplow Inspector](/docs/data-product-studio/data-quality/snowplow-inspector/index.md) | | **Snowplow Micro** | Component | Testing and QA pipeline | [Snowplow Micro](/docs/data-product-studio/data-quality/snowplow-micro/index.md) | | **Snowplow Mini** | Component | Sandbox development pipeline | [Snowplow Mini](/docs/api-reference/snowplow-mini/index.md) | +| **Snowplow Self-Hosted** | Offering | Production self-hosted Snowplow product offering | [Snowplow Self-Hosted](/docs/get-started/index.md#self-hosted) | | **Snowtype** | Component | Tool for custom tracking code generation | [Snowtype](/docs/data-product-studio/snowtype/index.md) | | **Snowtype CLI** | Component | Tool for command-line management of Snowtype code generation | [Snowtype CLI](/docs/data-product-studio/snowtype/using-the-cli/index.md) | | **Source** | Concept | Tracker SDK or webhook that sends events to a collector endpoint | [Sources](/docs/sources/index.md) | | **Source application** | Concept | Your application where you have implemented tracking | [Source applications](/docs/data-product-studio/source-applications/index.md) | | **Tracker** | Component | SDK that sends events to an event collector endpoint | [Trackers](/docs/sources/trackers/index.md) | -| **Tracking catalog** | Concept | List of received event types in Snowplow BDP Console | +| **Tracking catalog** | Concept | List of received event types in Snowplow Console | | **TSV** | Concept | Tab-separated values format, used for events within a Snowplow pipeline | [Enriched TSV format](/docs/fundamentals/canonical-event/understanding-the-enriched-tsv-format/index.md#overview---tsv-format) | | **Validation** | Concept | Pipeline process of checking events against their schemas | | | **Visualization** | Offering | Accompanying dashboard for a data model | [Visualizations](/docs/modeling-your-data/visualization/index.md) | diff --git a/docs/modeling-your-data/modeling-your-data-with-dbt/dbt-quickstart/legacy/fractribution/index.md b/docs/modeling-your-data/modeling-your-data-with-dbt/dbt-quickstart/legacy/fractribution/index.md index 32a302e6b..83af6835d 100644 --- 
a/docs/modeling-your-data/modeling-your-data-with-dbt/dbt-quickstart/legacy/fractribution/index.md +++ b/docs/modeling-your-data/modeling-your-data-with-dbt/dbt-quickstart/legacy/fractribution/index.md @@ -133,7 +133,7 @@ import FractributionDbtMacros from "@site/docs/reusable/fractribution-dbt-macros ``` ### 4. Run the model -Execute the following either through your CLI, within dbt Cloud, or within [Snowplow BDP](/docs/modeling-your-data/running-data-models-via-snowplow-bdp/dbt/index.md) +Execute the following either through your CLI, within dbt Cloud, or within [Snowplow Console](/docs/modeling-your-data/running-data-models-via-console/dbt/index.md) ```yml dbt run --select snowplow_fractribution diff --git a/docs/modeling-your-data/modeling-your-data-with-dbt/index.md b/docs/modeling-your-data/modeling-your-data-with-dbt/index.md index b523ecc00..2aa19b98c 100644 --- a/docs/modeling-your-data/modeling-your-data-with-dbt/index.md +++ b/docs/modeling-your-data/modeling-your-data-with-dbt/index.md @@ -20,9 +20,9 @@ dark: require('./images/dbt_packages-dark.drawio.png').default

-To setup dbt, Snowplow Community Edition users can start with the [dbt User Guide](https://docs.getdbt.com/guides/getting-started) and then we have prepared some [introduction videos](https://www.youtube.com/watch?v=1kd6BJhC4BE) for working with the Snowplow dbt packages. +To setup dbt, Snowplow Self-Hosted users can start with the [dbt User Guide](https://docs.getdbt.com/guides/getting-started) and then we have prepared some [introduction videos](https://www.youtube.com/watch?v=1kd6BJhC4BE) for working with the Snowplow dbt packages. -For Snowplow BDP customers, dbt projects can be configured and scheduled in the console meaning you can [get started](/docs/modeling-your-data/running-data-models-via-snowplow-bdp/dbt/index.md) running dbt models alongside your Snowplow pipelines. +For Snowplow CDI customers, dbt projects can be configured and scheduled in the console meaning you can [get started](/docs/modeling-your-data/running-data-models-via-console/dbt/index.md) running dbt models alongside your Snowplow pipelines. # Snowplow dbt Packages diff --git a/docs/modeling-your-data/modeling-your-data-with-sql-runner/migrating-to-dbt/index.md b/docs/modeling-your-data/modeling-your-data-with-sql-runner/migrating-to-dbt/index.md index 446d02855..7cbf151b2 100644 --- a/docs/modeling-your-data/modeling-your-data-with-sql-runner/migrating-to-dbt/index.md +++ b/docs/modeling-your-data/modeling-your-data-with-sql-runner/migrating-to-dbt/index.md @@ -17,7 +17,7 @@ This guide assumes you are running the standard web and/or mobile SQL Runner mod ## Why Migrate? -SQL Runner is currently in maintenance mode, while we will continue to fix bugs when they are identified, we are not actively developing the tool or the models anymore and at some point in the future may deprecate it entirely. Our dbt models on the other hand are under active development, with new features and optimizations being made regularly. It is also a more widely used tool, meaning installation and management is far easier (or you can use tools like dbt Cloud, or our BDP customers can run dbt models the same way you can SQL Runner). We also have a far wider range of packages available in dbt including e-commerce, marketing attribution, and a package to normalize your Snowplow data. +SQL Runner is currently in maintenance mode, while we will continue to fix bugs when they are identified, we are not actively developing the tool or the models anymore and at some point in the future may deprecate it entirely. Our dbt models on the other hand are under active development, with new features and optimizations being made regularly. It is also a more widely used tool, meaning installation and management is far easier (or you can use tools like dbt Cloud, or our CDI customers can run dbt models the same way you can SQL Runner). We also have a far wider range of packages available in dbt including e-commerce, marketing attribution, and a package to normalize your Snowplow data. In dbt we also support Databricks & Postgres warehouses in addition to Snowflake, BigQuery, and Redshift. Our [Accelerators](https://snowplow.io/data-product-accelerators/) contain our dbt models, and newer tracking plugins are only being modeled within our dbt packages. @@ -30,11 +30,11 @@ The core of the web and mobile models (e.g. 
page/screen views, sessions, and use We recommend you take a look at the [docs](/docs/modeling-your-data/modeling-your-data-with-dbt/index.md) for our dbt packages to get a better understanding of how they work and how you can use them going forward. ## Pre-requisites -We assume that you have dbt [installed](https://docs.getdbt.com/docs/core/installation), a working connection, and some basic understanding of using dbt including installing packages and running models. +We assume that you have dbt [installed](https://docs.getdbt.com/docs/core/installation), a working connection, and some basic understanding of using dbt including installing packages and running models. ## Mapping the variables -While the variables for your SQL Runner models are spread throughout the files, in dbt all variables are in your `dbt_project.yml` file. We have mostly been consistent between the two tools, with the dbt variables being prefixed by `snowplow__`, but some have new names. The table below maps each SQL Runner variable to the equivalent dbt variable, but there are many more you can set to customize how the models run - you can read about these in the relevant [configuration](/docs/modeling-your-data/modeling-your-data-with-dbt/dbt-configuration/index.md) page. +While the variables for your SQL Runner models are spread throughout the files, in dbt all variables are in your `dbt_project.yml` file. We have mostly been consistent between the two tools, with the dbt variables being prefixed by `snowplow__`, but some have new names. The table below maps each SQL Runner variable to the equivalent dbt variable, but there are many more you can set to customize how the models run - you can read about these in the relevant [configuration](/docs/modeling-your-data/modeling-your-data-with-dbt/dbt-configuration/index.md) page. 
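For illustration, a mapped configuration might end up looking something like this in `dbt_project.yml` — a minimal sketch only: the `snowplow_web` package scope and the example values are assumptions, and the authoritative variable mapping is in the table below.

```yml
# Hypothetical dbt_project.yml excerpt — variable names follow the mapping table
# below; the package scope (snowplow_web) and all values are placeholders.
vars:
  snowplow_web:
    snowplow__atomic_schema: atomic        # SQL Runner `input_schema`
    snowplow__start_date: '2023-01-01'     # SQL Runner `start_date`
    snowplow__app_id: ['website']          # SQL Runner `app_id_filters`
    snowplow__backfill_limit_days: 30      # SQL Runner `update_cadence_days`
```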
```mdx-code-block import DbtVariables from "@site/docs/reusable/dbt-variables/_index.md" @@ -44,39 +44,39 @@ import DbtVariables from "@site/docs/reusable/dbt-variables/_index.md" > Values in bold have a different name instead of just the prefix -| SQL Runner Variable | dbt variable | -| ---------------------------------- | ------------------------------------------------------------------------------------------------------- | -| **`app_errors`** | **`snowplow__enable_app_errors_module`** | -| **`app_id_filters`** | **`snowplow__app_id`** | -| **`application_context`** | **`snowplow__enable_application_context`** | -| `cleanup_mode` | No equivalent variable *(closest is `snowplow__allow_refresh` combined with dbt `--full-refresh` flag)* | -| `cluster_by` | No equivalent variable *(Clustering defined in model)* | -| `days_late_allowed` | `snowplow__days_late_allowed` | -| `derived_tstamp_partitioned` | `snowplow__derived_tstamp_partitioned` | +| SQL Runner Variable | dbt variable | +| -------------------------------------- | ------------------------------------------------------------------------------------------------------- | +| **`app_errors`** | **`snowplow__enable_app_errors_module`** | +| **`app_id_filters`** | **`snowplow__app_id`** | +| **`application_context`** | **`snowplow__enable_application_context`** | +| `cleanup_mode` | No equivalent variable *(closest is `snowplow__allow_refresh` combined with dbt `--full-refresh` flag)* | +| `cluster_by` | No equivalent variable *(Clustering defined in model)* | +| `days_late_allowed` | `snowplow__days_late_allowed` | +| `derived_tstamp_partitioned` | `snowplow__derived_tstamp_partitioned` | | **`enabled`** (Mobile app errors only) | **`snowplow__enable_app_errors_module`** | -| `ends_run` | No equivalent variable | -| `entropy` | No equivalent variable | -| **`geolocation_context`** | **`snowplow__enable_geolocation_context`** | -| `heartbeat` | `snowplow__heartbeat` | -| **`iab`** | **`snowplow__enable_iab`** | -| **`input_schema`** | **`snowplow__atomic_schema`** | -| `lookback_window_hours` | `snowplow__lookback_window_hours` | -| **`minimumVisitLength`** | **`snowplow_min_visit_length`** | -| **`mobile_context`** | **`snowplow__enable_mobile_context`** | -| **`model_version`** | No equivalent variable | -| `output_schema` | Set in `models` part of project file, see relevant configuration page for more info. | -| **`platform_filters`** | **`snowplow__platform`** | -| `scratch_schema` | Set in `models` part of project file, see relevant configuration page for more info. 
| -| **`screen_context`** | **`snowplow__enable_screen_contextt`** | -| `session_lookback_days` | `snowplow__session_lookback_days` *(default increased to 730)* | -| `skip_derived` | No equivalent variable *(use dbt `--select` flag)* | -| `stage_next` | No equivalent variable | -| `start_date` | `snowplow__start_date` | -| `ua_bot_filter` | `snowplow__ua_bot_filter` | -| **`ua_parser`** | **`snowplow__enable_ua`** | -| **`update_cadence_days`** | **`snowplow__backfill_limit_days`** *(default increased to 30)* | -| `upsert_lookback_days` | `snowplow__upsert_lookback_days` | -| **`yauaa`** | **`snowplow__enable_yauaa`** | +| `ends_run` | No equivalent variable | +| `entropy` | No equivalent variable | +| **`geolocation_context`** | **`snowplow__enable_geolocation_context`** | +| `heartbeat` | `snowplow__heartbeat` | +| **`iab`** | **`snowplow__enable_iab`** | +| **`input_schema`** | **`snowplow__atomic_schema`** | +| `lookback_window_hours` | `snowplow__lookback_window_hours` | +| **`minimumVisitLength`** | **`snowplow_min_visit_length`** | +| **`mobile_context`** | **`snowplow__enable_mobile_context`** | +| **`model_version`** | No equivalent variable | +| `output_schema` | Set in `models` part of project file, see relevant configuration page for more info. | +| **`platform_filters`** | **`snowplow__platform`** | +| `scratch_schema` | Set in `models` part of project file, see relevant configuration page for more info. | +| **`screen_context`** | **`snowplow__enable_screen_contextt`** | +| `session_lookback_days` | `snowplow__session_lookback_days` *(default increased to 730)* | +| `skip_derived` | No equivalent variable *(use dbt `--select` flag)* | +| `stage_next` | No equivalent variable | +| `start_date` | `snowplow__start_date` | +| `ua_bot_filter` | `snowplow__ua_bot_filter` | +| **`ua_parser`** | **`snowplow__enable_ua`** | +| **`update_cadence_days`** | **`snowplow__backfill_limit_days`** *(default increased to 30)* | +| `upsert_lookback_days` | `snowplow__upsert_lookback_days` | +| **`yauaa`** | **`snowplow__enable_yauaa`** | ## Setting up and running our dbt packages @@ -93,7 +93,7 @@ This method will also not correctly populate the user stitching table or process ::: -There may be cases where running the dbt models from scratch is not a viable option for you, in this case it is possible to migrate your existing derived SQL Runner data into the derived tables produced by dbt, however this will result in your data being generated from two slightly differing logics. +There may be cases where running the dbt models from scratch is not a viable option for you, in this case it is possible to migrate your existing derived SQL Runner data into the derived tables produced by dbt, however this will result in your data being generated from two slightly differing logics. It is advisable to produce your dbt tables into new schemas where possible, even though the derived tables should have different names; this will help keep your data separate and ensure that as we go through the following steps that dbt does not overwrite your SQL Runner tables. @@ -104,7 +104,7 @@ Postgres only supports the `MERGE` statement in version 15 and up, if you are us ::: ### Create dbt tables by doing a recent-dated run -Because of the difference in manifest tables and incremental logic between SQL Runner and dbt models it makes sense to first create the dbt tables and then insert your existing data into them, rather than try and create the dbt tables directly from your SQL Runner data. 
+Because of the difference in manifest tables and incremental logic between SQL Runner and dbt models it makes sense to first create the dbt tables and then insert your existing data into them, rather than try and create the dbt tables directly from your SQL Runner data. Once you have your dbt project and variables set up, change your `snowplow__start_date` to a recent date, say 7 days before the end of your last SQL Runner processed date, and run the project once. This will produce all the dbt tables including the manifest tables needed to manage the incremental logic, and ensure a good overlap between the end of your SQL Runner processing and the start of dbt processing. @@ -129,7 +129,7 @@ MERGE INTO .snowplow_web_page_views t USING .page_views s ON T.page_view_id = s.page_view_id WHEN NOT MATCHED THEN -INSERT +INSERT ( page_view_id, event_id, @@ -230,7 +230,7 @@ INSERT operating_system_name_version, operating_system_version ) -VALUES +VALUES ( s.page_view_id, s.event_id, @@ -344,7 +344,7 @@ MERGE INTO .snowplow_web_sessions t USING .sessions s ON T.domain_sessionid = s.domain_sessionid WHEN NOT MATCHED THEN -INSERT +INSERT ( app_id, domain_sessionid, @@ -543,7 +543,7 @@ MERGE INTO .snowplow_web_users t USING .users s ON T.domain_userid = s.domain_userid WHEN NOT MATCHED THEN -INSERT +INSERT ( user_id, domain_userid, @@ -650,7 +650,7 @@ MERGE INTO .snowplow_mobile_screen_views t USING .mobile_screen_views s ON T.screen_view_id = s.screen_view_id WHEN NOT MATCHED THEN -INSERT +INSERT ( screen_view_id, event_id, @@ -785,7 +785,7 @@ MERGE INTO .snowplow_mobile_sessions t USING .mobile_session s ON T.session_id = s.session_id WHEN NOT MATCHED THEN -INSERT +INSERT ( app_id, session_id, @@ -930,7 +930,7 @@ MERGE INTO .snowplow_mobile_users t USING .mobile_users s ON t.device_user_id = s.device_user_id WHEN NOT MATCHED THEN -INSERT +INSERT ( user_id, device_user_id, @@ -1031,7 +1031,7 @@ MERGE INTO .snowplow_mobile_app_errors t USING .mobile_app_errors s ON t.event_id = s.event_id WHEN NOT MATCHED THEN -INSERT +INSERT ( event_id, app_id, diff --git a/docs/modeling-your-data/modeling-your-data-with-sql-runner/sql-runner-web-data-model/index.md b/docs/modeling-your-data/modeling-your-data-with-sql-runner/sql-runner-web-data-model/index.md index 92ef79f6d..cc574af9c 100644 --- a/docs/modeling-your-data/modeling-your-data-with-sql-runner/sql-runner-web-data-model/index.md +++ b/docs/modeling-your-data/modeling-your-data-with-sql-runner/sql-runner-web-data-model/index.md @@ -37,7 +37,7 @@ Password can be left as a `PASSWORD_PLACEHOLDER`, and set as an environment var Variables in each module's playbook can also optionally be configured also. See each playbook directory's README for more detail on configuration of each module. -You can then run the model, either by running playbooks individually by running SQL-runner locally, or via your Snowplow BDP GitHub repository. Of course, as a Snowplow BDP customer you can also reach out to Support to get the model deployed for you. +You can then run the model, either by running playbooks individually by running SQL-runner locally, or via your Snowplow GitHub repository. Of course, as a Snowplow customer you can also reach out to Support to get the model deployed for you. 
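As a rough illustration of the playbook configuration mentioned above — treat every key and value here as a placeholder, since the authoritative layout for each module is in that playbook directory's README:

```yml
# Hypothetical playbook excerpt, for illustration only. The password stays as
# PASSWORD_PLACEHOLDER and is supplied via an environment variable at run time;
# the target details and variable names are example values.
:targets:
  - :name:     "Warehouse"
    :type:     redshift
    :host:     redshift.example.com
    :database: analytics
    :port:     5439
    :username: sql_runner
    :password: PASSWORD_PLACEHOLDER
:variables:
  :input_schema:   atomic
  :scratch_schema: scratch
  :output_schema:  derived
  :start_date:     '2023-01-01'
```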
## Technical architecture diff --git a/docs/modeling-your-data/running-data-models-via-snowplow-bdp/dbt/images/Screenshot-2021-11-15-at-20.15.28.png b/docs/modeling-your-data/running-data-models-via-console/dbt/images/Screenshot-2021-11-15-at-20.15.28.png similarity index 100% rename from docs/modeling-your-data/running-data-models-via-snowplow-bdp/dbt/images/Screenshot-2021-11-15-at-20.15.28.png rename to docs/modeling-your-data/running-data-models-via-console/dbt/images/Screenshot-2021-11-15-at-20.15.28.png diff --git a/docs/modeling-your-data/running-data-models-via-snowplow-bdp/dbt/images/Screenshot-2021-11-15-at-20.25.53.png b/docs/modeling-your-data/running-data-models-via-console/dbt/images/Screenshot-2021-11-15-at-20.25.53.png similarity index 100% rename from docs/modeling-your-data/running-data-models-via-snowplow-bdp/dbt/images/Screenshot-2021-11-15-at-20.25.53.png rename to docs/modeling-your-data/running-data-models-via-console/dbt/images/Screenshot-2021-11-15-at-20.25.53.png diff --git a/docs/modeling-your-data/running-data-models-via-snowplow-bdp/dbt/images/data-model-create-step-1.png b/docs/modeling-your-data/running-data-models-via-console/dbt/images/data-model-create-step-1.png similarity index 100% rename from docs/modeling-your-data/running-data-models-via-snowplow-bdp/dbt/images/data-model-create-step-1.png rename to docs/modeling-your-data/running-data-models-via-console/dbt/images/data-model-create-step-1.png diff --git a/docs/modeling-your-data/running-data-models-via-snowplow-bdp/dbt/images/data-model-create-step-2.png b/docs/modeling-your-data/running-data-models-via-console/dbt/images/data-model-create-step-2.png similarity index 100% rename from docs/modeling-your-data/running-data-models-via-snowplow-bdp/dbt/images/data-model-create-step-2.png rename to docs/modeling-your-data/running-data-models-via-console/dbt/images/data-model-create-step-2.png diff --git a/docs/modeling-your-data/running-data-models-via-snowplow-bdp/dbt/images/data-model-create-step-3-selectexclude.png b/docs/modeling-your-data/running-data-models-via-console/dbt/images/data-model-create-step-3-selectexclude.png similarity index 100% rename from docs/modeling-your-data/running-data-models-via-snowplow-bdp/dbt/images/data-model-create-step-3-selectexclude.png rename to docs/modeling-your-data/running-data-models-via-console/dbt/images/data-model-create-step-3-selectexclude.png diff --git a/docs/modeling-your-data/running-data-models-via-snowplow-bdp/dbt/images/data-model-create-step-3.png b/docs/modeling-your-data/running-data-models-via-console/dbt/images/data-model-create-step-3.png similarity index 100% rename from docs/modeling-your-data/running-data-models-via-snowplow-bdp/dbt/images/data-model-create-step-3.png rename to docs/modeling-your-data/running-data-models-via-console/dbt/images/data-model-create-step-3.png diff --git a/docs/modeling-your-data/running-data-models-via-snowplow-bdp/dbt/images/data-models-navbar.png b/docs/modeling-your-data/running-data-models-via-console/dbt/images/data-models-navbar.png similarity index 100% rename from docs/modeling-your-data/running-data-models-via-snowplow-bdp/dbt/images/data-models-navbar.png rename to docs/modeling-your-data/running-data-models-via-console/dbt/images/data-models-navbar.png diff --git a/docs/modeling-your-data/running-data-models-via-snowplow-bdp/dbt/images/image-1.png b/docs/modeling-your-data/running-data-models-via-console/dbt/images/image-1.png similarity index 100% rename from 
docs/modeling-your-data/running-data-models-via-snowplow-bdp/dbt/images/image-1.png rename to docs/modeling-your-data/running-data-models-via-console/dbt/images/image-1.png diff --git a/docs/modeling-your-data/running-data-models-via-snowplow-bdp/dbt/images/image.png b/docs/modeling-your-data/running-data-models-via-console/dbt/images/image.png similarity index 100% rename from docs/modeling-your-data/running-data-models-via-snowplow-bdp/dbt/images/image.png rename to docs/modeling-your-data/running-data-models-via-console/dbt/images/image.png diff --git a/docs/modeling-your-data/running-data-models-via-snowplow-bdp/dbt/index.md b/docs/modeling-your-data/running-data-models-via-console/dbt/index.md similarity index 87% rename from docs/modeling-your-data/running-data-models-via-snowplow-bdp/dbt/index.md rename to docs/modeling-your-data/running-data-models-via-console/dbt/index.md index 684d018e1..9d251c0ad 100644 --- a/docs/modeling-your-data/running-data-models-via-snowplow-bdp/dbt/index.md +++ b/docs/modeling-your-data/running-data-models-via-console/dbt/index.md @@ -1,5 +1,5 @@ --- -title: "Running custom dbt models via Snowplow BDP" +title: "Running custom dbt models via Snowplow CDI" sidebar_label: "Custom models" sidebar_position: 2 --- @@ -11,7 +11,7 @@ import TabItem from '@theme/TabItem'; ### Overview -If you are a Snowplow BDP customer, you can get started with configuring and deploying dbt projects as outlined in the steps below. For more information about setting up your dbt project you can look at the [Snowplow dbt docs](/docs/modeling-your-data/modeling-your-data-with-dbt/index.md). +If you are a Snowplow CDI customer, you can get started with configuring and deploying dbt projects as outlined in the steps below. For more information about setting up your dbt project you can look at the [Snowplow dbt docs](/docs/modeling-your-data/modeling-your-data-with-dbt/index.md). As an initial overview, in your snowplow-pipeline repository, your data models reside in the dbt directory. To start with, your GitHub repository will look like this (you may have additional folders based on your project e.g. `dbt_packages` or `docs`): @@ -46,11 +46,11 @@ When the schedule kicks off, the data model configuration is loaded and validate ::: -Read below for more details on the steps to configure and run your dbt data models with Snowplow BDP. +Read below for more details on the steps to configure and run your dbt data models with Snowplow. ### 1. Setup your dbt profile -You need to provide your prod connection profile for the warehouse you are connecting to in the `profiles.yml` file for **each datamodel**. Ensure that your profile and target are set to `prod`. See [the dbt adapters docs](https://docs.getdbt.com/docs/supported-data-platforms#verified-adapters) for more specific configuration information for each database. +You need to provide your prod connection profile for the warehouse you are connecting to in the `profiles.yml` file for **each data model**. Ensure that your profile and target are set to `prod`. See [the dbt adapters docs](https://docs.getdbt.com/docs/supported-data-platforms#verified-adapters) for more specific configuration information for each database. @@ -159,7 +159,7 @@ profile_name: :::info -The warehouse password should be sent by [secure form from the Snowplow BDP Console](https://console.snowplowanalytics.com/secure-messaging/freeform) in order to set the environment variables. 
+The warehouse password should be sent by [secure form from Snowplow Console](https://console.snowplowanalytics.com/secure-messaging/freeform) in order to set the environment variables. ::: @@ -173,7 +173,7 @@ import DbtPrivs from "@site/docs/reusable/dbt-privs/_index.md" ### 2. The data modeling configuration -Data models can be configured via the [Data Models](https://console.snowplowanalytics.com/data-models) page in Snowplow BDP Console: +Data models can be configured via the [Data Models](https://console.snowplowanalytics.com/data-models) page in Snowplow Console: ![](images/data-models-navbar.png) @@ -234,7 +234,7 @@ Please make sure all your dbt project files are merged to the default branch in ### 3. Model execution -Once everything is set up, Snowplow BDP Console will run the following commands in this order: +Once everything is set up, Console will run the following commands in this order: 1. `dbt deps` (if a `packages.yml` file is present) 2. `dbt seed` 3. `dbt snapshot` @@ -243,6 +243,6 @@ Once everything is set up, Snowplow BDP Console will run the following commands This ensures that the correct package dependencies are installed, that seeds are uploaded and refreshed, that snapshots are taken, that the dbt models are created, and that all specified tests are run. -### 4. Monitor your model in the Snowplow BDP Console +### 4. Monitor your model in Console -After everything has been set up and has executed, you can now monitor your data models running against your data warehouse from the Snowplow BDP Console, in the Jobs UI! There you can see the data modeling DAG generated, and monitor the status, duration and run times of the data model. You can also browse through the logs that dbt generates during it's runs. If all seeds, snapshots, models, and tests pass you will see the `Result: SUCCEEDED` status in the Jobs UI. If any of the steps fail (including tests that result in a warning), you will see the `Result: FAILED` status. +After everything has been set up and has executed, you can now monitor your data models running against your data warehouse from Console, in the Jobs UI. There you can see the data modeling DAG generated, and monitor the status, duration and run times of the data model. You can also browse through the logs that dbt generates during it's runs. If all seeds, snapshots, models, and tests pass you will see the `Result: SUCCEEDED` status in the Jobs UI. If any of the steps fail (including tests that result in a warning), you will see the `Result: FAILED` status. 
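To tie the steps above together, here is a hedged sketch of what the `prod` profile from step 1 could look like for a Snowflake connection — the adapter choice, field values, and the `SNOWFLAKE_PASSWORD` variable name are assumptions, with the actual password supplied through the environment variable set via the secure form:

```yml
# Illustrative profiles.yml only — follow your warehouse adapter's documentation
# for the exact fields. Both the profile and the target are named `prod`.
prod:
  target: prod
  outputs:
    prod:
      type: snowflake
      account: acme-account
      user: dbt_runner
      password: "{{ env_var('SNOWFLAKE_PASSWORD') }}"
      role: TRANSFORMER
      warehouse: TRANSFORMING
      database: ANALYTICS
      schema: derived
      threads: 4
```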
diff --git a/docs/modeling-your-data/running-data-models-via-snowplow-bdp/dbt/resolving-data-model-failures/images/dbt-dag.png b/docs/modeling-your-data/running-data-models-via-console/dbt/resolving-data-model-failures/images/dbt-dag.png similarity index 100% rename from docs/modeling-your-data/running-data-models-via-snowplow-bdp/dbt/resolving-data-model-failures/images/dbt-dag.png rename to docs/modeling-your-data/running-data-models-via-console/dbt/resolving-data-model-failures/images/dbt-dag.png diff --git a/docs/modeling-your-data/running-data-models-via-snowplow-bdp/dbt/resolving-data-model-failures/images/dbt-step-error-output.png b/docs/modeling-your-data/running-data-models-via-console/dbt/resolving-data-model-failures/images/dbt-step-error-output.png similarity index 100% rename from docs/modeling-your-data/running-data-models-via-snowplow-bdp/dbt/resolving-data-model-failures/images/dbt-step-error-output.png rename to docs/modeling-your-data/running-data-models-via-console/dbt/resolving-data-model-failures/images/dbt-step-error-output.png diff --git a/docs/modeling-your-data/running-data-models-via-snowplow-bdp/dbt/resolving-data-model-failures/index.md b/docs/modeling-your-data/running-data-models-via-console/dbt/resolving-data-model-failures/index.md similarity index 83% rename from docs/modeling-your-data/running-data-models-via-snowplow-bdp/dbt/resolving-data-model-failures/index.md rename to docs/modeling-your-data/running-data-models-via-console/dbt/resolving-data-model-failures/index.md index c558f51a6..76d8e0475 100644 --- a/docs/modeling-your-data/running-data-models-via-snowplow-bdp/dbt/resolving-data-model-failures/index.md +++ b/docs/modeling-your-data/running-data-models-via-console/dbt/resolving-data-model-failures/index.md @@ -6,7 +6,7 @@ sidebar_position: 200 :::note -This documentation assumes you are running your data models via the data modeling UI in Snowplow BDP Console, as described in [the documentation for running data models via Snowplow BDP](/docs/modeling-your-data/running-data-models-via-snowplow-bdp/dbt/index.md#2-the-data-modeling-configuration). +This documentation assumes you are running your data models via the data modeling UI in Snowplow Console, as described in [the documentation for running data models](/docs/modeling-your-data/running-data-models-via-console/dbt/index.md#2-the-data-modeling-configuration). ::: @@ -20,7 +20,7 @@ You will be able to see the details of your data model failure in the jobs inter ![](images/dbt-dag.png) -The 'Error Output' will show you the error logs from the `dbt run` call. These logs will contain the information that dbt and the database relayed back to BDP from the failure, and will match the dbt logs you get if you run it locally. +The 'Error Output' will show you the error logs from the `dbt run` call. These logs will contain the information that dbt and the database relayed back to Console from the failure, and will match the dbt logs you get if you run it locally. 
![](images/dbt-step-error-output.png) diff --git a/docs/modeling-your-data/running-data-models-via-snowplow-bdp/images/data-models.png b/docs/modeling-your-data/running-data-models-via-console/images/data-models.png similarity index 100% rename from docs/modeling-your-data/running-data-models-via-snowplow-bdp/images/data-models.png rename to docs/modeling-your-data/running-data-models-via-console/images/data-models.png diff --git a/docs/modeling-your-data/running-data-models-via-snowplow-bdp/images/warehouse-connections.png b/docs/modeling-your-data/running-data-models-via-console/images/warehouse-connections.png similarity index 100% rename from docs/modeling-your-data/running-data-models-via-snowplow-bdp/images/warehouse-connections.png rename to docs/modeling-your-data/running-data-models-via-console/images/warehouse-connections.png diff --git a/docs/modeling-your-data/running-data-models-via-snowplow-bdp/index.md b/docs/modeling-your-data/running-data-models-via-console/index.md similarity index 71% rename from docs/modeling-your-data/running-data-models-via-snowplow-bdp/index.md rename to docs/modeling-your-data/running-data-models-via-console/index.md index 904a42da4..8c9c6a320 100644 --- a/docs/modeling-your-data/running-data-models-via-snowplow-bdp/index.md +++ b/docs/modeling-your-data/running-data-models-via-console/index.md @@ -2,11 +2,8 @@ title: "Running data models" date: "2020-12-01" sidebar_position: 2 -description: "Guides to run data models in Snowplow BDP, both dbt and SQL Runner." +description: "Guides to run data models in Snowplow, both dbt and SQL Runner." sidebar_label: "Running data models" -sidebar_custom_props: - offerings: - - bdp --- @@ -17,10 +14,10 @@ import TabItem from '@theme/TabItem'; [Standard data models](/docs/modeling-your-data/modeling-your-data-with-dbt/dbt-models/index.md) are authored and maintained by Snowplow. Follow the steps below to create and run one. -You can also create [custom data models](/docs/modeling-your-data/running-data-models-via-snowplow-bdp/dbt/index.md). +You can also create [custom data models](/docs/modeling-your-data/running-data-models-via-console/dbt/index.md). ## Create a warehouse connection -Begin by creating a new warehouse connection. It will be used by a data model to connect to your warehouse.Go to [Destinations/Connections](https://console.snowplowanalytics.com/connections), click on "Set up connection", and fill in all the necessary details. +Begin by creating a new warehouse connection. It will be used by a data model to connect to your warehouse. Go to [Destinations/Connections](https://console.snowplowanalytics.com/connections), click on "Set up connection", and fill in all the necessary details. ![](./images/warehouse-connections.png) @@ -38,4 +35,4 @@ To create a new data model, click the "Add data model" button in the [Data Model ![](images/data-models.png) ## Monitor data model runs -After you've set everything up, Snowplow BDP Console will run the model according to the provided schedule. You can monitor your data model runs on the [Jobs page](https://console.snowplowanalytics.com/jobs). +After you've set everything up, Snowplow Console will run the model according to the provided schedule. You can monitor your data model runs on the [Jobs page](https://console.snowplowanalytics.com/jobs). 
diff --git a/docs/modeling-your-data/running-data-models-via-snowplow-bdp/retrieving-job-execution-data-via-the-api/index.md b/docs/modeling-your-data/running-data-models-via-console/retrieving-job-execution-data-via-the-api/index.md similarity index 97% rename from docs/modeling-your-data/running-data-models-via-snowplow-bdp/retrieving-job-execution-data-via-the-api/index.md rename to docs/modeling-your-data/running-data-models-via-console/retrieving-job-execution-data-via-the-api/index.md index 549df5f86..a7febf08e 100644 --- a/docs/modeling-your-data/running-data-models-via-snowplow-bdp/retrieving-job-execution-data-via-the-api/index.md +++ b/docs/modeling-your-data/running-data-models-via-console/retrieving-job-execution-data-via-the-api/index.md @@ -4,7 +4,7 @@ sidebar_position: 3 sidebar_label: "Retrieve job execution data" --- -The API that powers the warehouse jobs monitoring view in Snowplow BDP Console (Jobs) is also available for consumption by other authenticated clients. +The API that powers the warehouse jobs monitoring view in Snowplow Console (Jobs) is also available for consumption by other authenticated clients. The exact same data about past and current jobs executions can be retrieved and processed programmatically. Hence, it possible to integrate with your monitoring infrastructure and enable additional alerting or insights. diff --git a/docs/modeling-your-data/visualization/index.md b/docs/modeling-your-data/visualization/index.md index 5649135fe..436627443 100644 --- a/docs/modeling-your-data/visualization/index.md +++ b/docs/modeling-your-data/visualization/index.md @@ -2,9 +2,6 @@ title: "Visualizations" sidebar_position: 8 sidebar_label: "Visualizations" -sidebar_custom_props: - offerings: - - bdp --- ```mdx-code-block @@ -65,10 +62,10 @@ The installation workflow will look something like this: ### Data model dependencies -Generally, visualizations will depend on data models. If there are dependencies, the installation flow will highlight which models are required and what models you currently have [running via BDP](/docs/modeling-your-data/running-data-models-via-snowplow-bdp/index.md). It will also highlight any properties that you need to enable or configure for these data models. +Generally, visualizations will depend on data models. If there are dependencies, the installation flow will highlight which models are required and what models you currently have [running via Console](/docs/modeling-your-data/running-data-models-via-console/index.md). It will also highlight any properties that you need to enable or configure for these data models. :::note Manual configuration for Open Source -If you are running the necessary data models yourself outside of BDP, then you will need to manually check that your setup satisfies the requirements for each visualization. These requirements are listed within the documentation pages for each visualization. +If you are running the necessary data models yourself outside of Console, then you will need to manually check that your setup satisfies the requirements for each visualization. These requirements are listed within the documentation pages for each visualization. 
::: ## Warehouse connection diff --git a/docs/pipeline/collector/index.md b/docs/pipeline/collector/index.md index 298d5d2e1..adc37c567 100644 --- a/docs/pipeline/collector/index.md +++ b/docs/pipeline/collector/index.md @@ -8,9 +8,9 @@ Once your [event collector](/docs/fundamentals/index.md) is set up, along with [ Read more about the technical aspects of the collector [here](/docs/api-reference/stream-collector/index.md). -## Viewing collector configuration in Snowplow BDP Console +## Viewing collector configuration in Snowplow Console -The easiest way to access collector configuration is to view it within the Snowplow BDP Console. To do that, after you log in click on _Pipeline Configuration_ under the respective pipeline's navigation menu: +The easiest way to access collector configuration is to view it within the Snowplow Console. To do that, after you log in click on _Pipeline Configuration_ under the respective pipeline's navigation menu: ![](images/image-1.png) @@ -26,7 +26,7 @@ This view is consuming the respective API that you can also access, as discussed ## Consuming the collector configuration API -As a Snowplow BDP customer you already benefit from 24x7 monitoring of pipeline collector health. If you wish to add collector monitoring to your internal monitoring systems nevertheless, the maintainable way to do this is to retrieve collector endpoints and other configuration values via the available API, then invoke your health checks on them. +As a Snowplow customer you already benefit from 24x7 monitoring of pipeline collector health. If you wish to add collector monitoring to your internal monitoring systems nevertheless, the maintainable way to do this is to retrieve collector endpoints and other configuration values via the available API, then invoke your health checks on them. ### Authorization @@ -139,6 +139,6 @@ The `cookieAttributes` object is always expected to be available and contains th Finally, `blockUnencrypted` is an optional boolean property indicating whether un-encrypted traffic should be allowed or not. If not available, the default is `false` (i.e. "do not block"). -## Configuring the collector for Community Edition users +## Configuring the collector for Snowplow Self-Hosted users -After you have installed the [Collector](/docs/api-reference/stream-collector/index.md) (either following the [Quick Start guide](/docs/get-started/snowplow-community-edition/what-is-quick-start/index.md) or [manually](/docs/api-reference/stream-collector/setup/index.md)), you can follow the [reference page](/docs/api-reference/stream-collector/configure/index.md) to configure it. +After you have installed the [Collector](/docs/api-reference/stream-collector/index.md) (either following the [Quick Start guide](/docs/get-started/self-hosted/index.md) or [manually](/docs/api-reference/stream-collector/setup/index.md)), you can follow the [reference page](/docs/api-reference/stream-collector/configure/index.md) to configure it. diff --git a/docs/pipeline/enrichments/managing-enrichments/index.md b/docs/pipeline/enrichments/managing-enrichments/index.md index b8aa19ecc..fc9516af1 100644 --- a/docs/pipeline/enrichments/managing-enrichments/index.md +++ b/docs/pipeline/enrichments/managing-enrichments/index.md @@ -1,13 +1,10 @@ --- title: "Managing enrichments" sidebar_position: 30 -sidebar_custom_props: - offerings: - - bdp --- -Snowplow BDP Console enables you to manage the Enrichments that run on each of your pipelines. 
+Snowplow Console enables you to manage the Enrichments that run on each of your pipelines. To start managing Enrichments, navigate to the pipeline you'd like to manage and click on the _Enrichments_ tab. diff --git a/docs/pipeline/enrichments/managing-enrichments/terraform/index.md b/docs/pipeline/enrichments/managing-enrichments/terraform/index.md index 3d5e5a159..20b2c11a5 100644 --- a/docs/pipeline/enrichments/managing-enrichments/terraform/index.md +++ b/docs/pipeline/enrichments/managing-enrichments/terraform/index.md @@ -1,14 +1,11 @@ --- -title: "Managing enrichments in Snowplow Community Edition" +title: "Managing enrichments in Snowplow Self-Hosted" date: "2021-10-06" -sidebar_label: "Community Edition: Terraform" +sidebar_label: "Snowplow Self-Hosted: Terraform" sidebar_position: 15 -sidebar_custom_props: - offerings: - - community --- -If you have installed Snowplow via [Quick Start](/docs/get-started/snowplow-community-edition/what-is-quick-start/index.md), you will have the following enrichments enabled by default: +If you have installed Snowplow via [Quick Start](/docs/get-started/self-hosted/index.md), you will have the following enrichments enabled by default: - [UA parser](/docs/pipeline/enrichments/available-enrichments/ua-parser-enrichment/index.md) - [YAUAA](/docs/pipeline/enrichments/available-enrichments/yauaa-enrichment/index.md) diff --git a/docs/pipeline/security/customer-managed-keys/index.md b/docs/pipeline/security/customer-managed-keys/index.md index 425b0c46f..d8ce2e11f 100644 --- a/docs/pipeline/security/customer-managed-keys/index.md +++ b/docs/pipeline/security/customer-managed-keys/index.md @@ -7,7 +7,7 @@ This guide provides step-by-step instructions for setting up AWS KMS (Key Manage :::note -This is a bolt-on security feature available for enterprise customers using BDP PMC (Private Managed Cloud) or BDP Cloud deployments. +This is a bolt-on security feature available for enterprise customers using CDI Private Managed Cloud or CDI Cloud deployments. ::: diff --git a/docs/resources/community-license-faq/index.md b/docs/resources/community-license-faq/index.md index 62e071f73..7de279768 100644 --- a/docs/resources/community-license-faq/index.md +++ b/docs/resources/community-license-faq/index.md @@ -43,7 +43,7 @@ Only the MIT and Apache 2 licensed software from Snowplow can be embedded and di ## I have commercially licensed software from Snowplow. Does this impact me? -No, if you have entered into a separate commercial licensing with Snowplow, for example, buying a Snowplow BDP commercial product, then the commercial license terms you have agreed to will continue to govern your use of the software. +No, if you have entered into a separate commercial licensing with Snowplow, for example, buying a Snowplow CDI commercial product, then the commercial license terms you have agreed to will continue to govern your use of the software. ## How can I contact Snowplow in case of doubts? diff --git a/docs/resources/copyright-license/index.md b/docs/resources/copyright-license/index.md index 841f7d625..4f75711ed 100644 --- a/docs/resources/copyright-license/index.md +++ b/docs/resources/copyright-license/index.md @@ -101,4 +101,4 @@ When in doubt, consult each component’s GitHub repository for the LICENSE file ## Proprietary components -[Snowplow BDP](https://snowplow.io/snowplow-bdp/) is built upon the above components, but adds a vast set of proprietary, closed source ones (UI, API, highly available deployment logic, and so on). 
These are only available under a commercial license for Snowplow customers. +[Snowplow CDI](https://snowplow.io/) is built upon the above components, but adds a vast set of proprietary, closed source ones (UI, API, highly available deployment logic, and so on). These are only available under a commercial license for Snowplow customers. diff --git a/docs/resources/limited-use-license-faq/index.md b/docs/resources/limited-use-license-faq/index.md index 4f4ba0ab0..fc0739782 100644 --- a/docs/resources/limited-use-license-faq/index.md +++ b/docs/resources/limited-use-license-faq/index.md @@ -75,7 +75,7 @@ If you currently run Snowplow’s open source pipeline code in production (on li If you are a current user of Snowplow open source software but do not run the core pipeline code in production or in a competitive manner, you may continue to use new Snowplow Limited Use License Agreement versions of Snowplow. -If you are a commercial Snowplow BDP Enterprise or BDP Cloud customer, this license change does not impact your use of Snowplow. +If you are a commercial Snowplow CDI Private Managed Cloud or CDI Cloud customer, this license change does not impact your use of Snowplow. ### Examples diff --git a/docs/resources/personal-and-academic-license-faq/index.md b/docs/resources/personal-and-academic-license-faq/index.md index 26326d393..b01eada55 100644 --- a/docs/resources/personal-and-academic-license-faq/index.md +++ b/docs/resources/personal-and-academic-license-faq/index.md @@ -16,7 +16,7 @@ The SPAL is based on the [PolyForm Noncommercial License](https://polyformprojec ## I have commercially licensed software from Snowplow. Does this impact me? -No. If you have entered into a separate commercial licensing with Snowplow, for example, buying a Snowplow BDP commercial product, then the commercial license terms you have agreed to will continue to govern your use of the software. +No. If you have entered into a separate commercial licensing with Snowplow, for example, buying a Snowplow CDI commercial product, then the commercial license terms you have agreed to will continue to govern your use of the software. ## Why make the source available? diff --git a/docs/reusable/data-modeling/consent/_index.md b/docs/reusable/data-modeling/consent/_index.md index a0a3c95f4..d585f2673 100644 --- a/docs/reusable/data-modeling/consent/_index.md +++ b/docs/reusable/data-modeling/consent/_index.md @@ -48,4 +48,4 @@ vars: If you have previously run the model without this optional module enabled, you can simply enable the module and run \`dbt run --selector snowplow_${props.packageName}\` as many times as needed for this module to catch up with your other data. If you only wish to process this from a specific date, be sure to change your \`snowplow__start_date\`, or refer to the [Custom module](/docs/modeling-your-data/modeling-your-data-with-dbt/dbt-custom-models/index.md) section for a detailed guide on how to achieve this the most efficient way.`}/> +If you haven't run the web package before, then you can run it using \`dbt run --selector snowplow_${props.packageName}\` either through your CLI, within dbt Cloud, or for Snowplow CDI customers you can use Snowplow Console. 
In this situation, all models will start in-sync as no events have been processed.`}/> diff --git a/docs/reusable/data-modeling/core-web-vitals/_index.md b/docs/reusable/data-modeling/core-web-vitals/_index.md index 9b1506dd1..894b0ee22 100644 --- a/docs/reusable/data-modeling/core-web-vitals/_index.md +++ b/docs/reusable/data-modeling/core-web-vitals/_index.md @@ -67,4 +67,4 @@ case when lcp_result = 'good' and fid_result = 'good' and cls_result = 'good' th If you have previously run the model without this optional module enabled, you can simply enable the module and run \`dbt run --selector snowplow_${props.packageName}\` as many times as needed for this module to catch up with your other data. If you only wish to process this from a specific date, be sure to change your \`snowplow__start_date\`, or refer to the [Custom module](/docs/modeling-your-data/modeling-your-data-with-dbt/dbt-custom-models/index.md) section for a detailed guide on how to achieve this the most efficient way.`}/> +If you haven't run the web package before, then you can run it using \`dbt run --selector snowplow_${props.packageName}\` either through your CLI, within dbt Cloud, or for Snowplow CDI customers you can use Snowplow Console. In this situation, all models will start in-sync as no events have been processed.`}/> diff --git a/docs/reusable/telemetry/_index.md b/docs/reusable/telemetry/_index.md index 8d62eb4ee..57762761d 100644 --- a/docs/reusable/telemetry/_index.md +++ b/docs/reusable/telemetry/_index.md @@ -11,6 +11,6 @@ This data is anonymous and minimal, and since our code is open source, you can i
{props.children}
-See our [telemetry principles](/docs/get-started/snowplow-community-edition/telemetry/index.md) for more information. +See our [telemetry principles](/docs/get-started/self-hosted/telemetry/index.md) for more information. \ No newline at end of file diff --git a/docs/sources/first-party-tracking/index.md b/docs/sources/first-party-tracking/index.md index 757d33f51..09e6485ec 100644 --- a/docs/sources/first-party-tracking/index.md +++ b/docs/sources/first-party-tracking/index.md @@ -2,8 +2,6 @@ title: "First-party tracking" sidebar_position: 40 sidebar_custom_props: - offerings: - - bdp # hide from sidebar and external search until these instructions apply more universally hidden: true --- @@ -26,7 +24,7 @@ Before starting, ensure you can access and edit the configuration of your hostin :::info -The flow described below might differ depending on the version of Snowplow BDP you are using. +The flow described below might differ depending on the version of Snowplow you are using. ::: @@ -73,7 +71,7 @@ For example, if you own both `gardening.primary-domain.co.uk` and `insurance.pri ![enter_domain](images/Screenshot_enter_domain.png) ## Configuring DNS records -In the next step, BDP will generate the required DNS records. This may take several minutes. +In the next step, Snowplow will generate the required DNS records. This may take several minutes. When the records are ready, you will receive a confirmation by email. @@ -81,13 +79,13 @@ When the records are ready, you will receive a confirmation by email. Once the DNS records are available, copy them into your domain provider. -The set of DNS records will contain a special record that allows BDP to verify that the setup is correct. +The set of DNS records will contain a special record that allows Snowplow to verify that the setup is correct. :::info -You will have to conclude this step within 72 hours. If BDP is unable to verify the setup within that timeframe, you will have to restart the process from the beginning. +You will have to conclude this step within 72 hours. If Snowplow is unable to verify the setup within that timeframe, you will have to restart the process from the beginning. ::: -Once BDP verifies your DNS setup, the status in the bottom left will change and you will receive a confirmation by email. +Once Snowplow verifies your DNS setup, the status in the bottom left will change and you will receive a confirmation by email. ![dns_records](images/Screenshot_dns_records.png) diff --git a/docs/sources/trackers/pixel-tracker/index.md b/docs/sources/trackers/pixel-tracker/index.md index 11c049779..00bef348e 100644 --- a/docs/sources/trackers/pixel-tracker/index.md +++ b/docs/sources/trackers/pixel-tracker/index.md @@ -45,7 +45,7 @@ Identify the event you wish to track. This may be opening a particular email tha ### Use the pixel tag generator -Snowplow BDP customers can generate a pixel tracker aimed at structured events via the [Snowplow BDP Console](https://console.snowplowanalytics.com/pixel-tracker). +Snowplow customers can generate a pixel tracker aimed at structured events via [Snowplow Console](https://console.snowplowanalytics.com/pixel-tracker). 
#### Choose your collector domain @@ -78,7 +78,7 @@ You can use the Pixel tracker for click tracking aka URI redirects: - On clicking this link, the collector will register the link and then do a 302 redirect to the supplied `{{uri}}` - As well as the `&u={{uri}}` parameter, you can populate the collector URI with any other fields from the [Snowplow Tracker Protocol](/docs/events/index.md) -Redirect tracking is usually disabled by default, and is disabled by default for all Snowplow BDP users. To use this feature, you need to enable this in your collector configuration. Snowplow BDP customers can enable this from within the Pipeline Configuration screen of the Snowplow BDP Console. +Redirect tracking is usually disabled by default, and is disabled by default for all Snowplow customers. To use this feature, you need to enable this in your collector configuration. Snowplow customers can enable this from within the Pipeline Configuration screen of Snowplow Console. You should also restrict values which are allowed within the `u` parameter to prevent phising attacks using this redirect endpoint. One option is to use [AWS WAF](https://aws.amazon.com/waf/) or [Google Cloud Armor](https://cloud.google.com/armor) (depending on your cloud). They let you block traffic that matches rules you define, such as a regex that the value of the `u` parameter must match. diff --git a/docs/sources/trackers/web-trackers/cookies-and-local-storage/configuring-cookies/index.md b/docs/sources/trackers/web-trackers/cookies-and-local-storage/configuring-cookies/index.md index 4a901649a..0ac6bab46 100644 --- a/docs/sources/trackers/web-trackers/cookies-and-local-storage/configuring-cookies/index.md +++ b/docs/sources/trackers/web-trackers/cookies-and-local-storage/configuring-cookies/index.md @@ -14,8 +14,8 @@ Snowplow allows for a highly configurable cookie set up. This allows for you to In the PDF below you'll find a flow chart to help you with your cookie configuration, guiding you through the configuration options for both your [Snowplow Collector](/docs/api-reference/stream-collector/index.md) and the [Snowplow JavaScript Tracker](/docs/sources/trackers/web-trackers/index.md). -- [Cookie configuration for Snowplow Community Edition](pathname:///assets/config-calculator-snowplow-ce.pdf) -- [Cookie configuration for Snowplow BDP](pathname:///assets/config-calculator-snowplow-bdp.pdf) +- [Cookie configuration for Snowplow CDI](pathname:///assets/config-calculator-snowplow-bdp.pdf) +- [Cookie configuration for Snowplow Self-Hosted](pathname:///assets/config-calculator-snowplow-ce.pdf) ## Cookie name diff --git a/docs/sources/trackers/web-trackers/previous-versions/web-trackers-v3/cookies-and-local-storage/configuring-cookies/index.md b/docs/sources/trackers/web-trackers/previous-versions/web-trackers-v3/cookies-and-local-storage/configuring-cookies/index.md index 7a0e7189c..fac9bffb0 100644 --- a/docs/sources/trackers/web-trackers/previous-versions/web-trackers-v3/cookies-and-local-storage/configuring-cookies/index.md +++ b/docs/sources/trackers/web-trackers/previous-versions/web-trackers-v3/cookies-and-local-storage/configuring-cookies/index.md @@ -14,8 +14,8 @@ Snowplow allows for a highly configurable cookie set up. 
This allows for you to In the PDF below you'll find a flow chart to help you with your cookie configuration, guiding you through the configuration options for both your [Snowplow Collector](/docs/api-reference/stream-collector/index.md) and the [Snowplow JavaScript Tracker](/docs/sources/trackers/web-trackers/index.md). -- [Cookie configuration for Snowplow Community Edition](pathname:///assets/config-calculator-snowplow-ce.pdf) -- [Cookie configuration for Snowplow BDP](pathname:///assets/config-calculator-snowplow-bdp.pdf) +- [Cookie configuration for Snowplow CDI](pathname:///assets/config-calculator-snowplow-bdp.pdf) +- [Cookie configuration for Snowplow Self-Hosted](pathname:///assets/config-calculator-snowplow-ce.pdf) ## Cookie name diff --git a/docs/sources/trackers/web-trackers/previous-versions/web-trackers-v3/quick-start-guide/index.md b/docs/sources/trackers/web-trackers/previous-versions/web-trackers-v3/quick-start-guide/index.md index 99e0ac7ea..e9b3549ae 100644 --- a/docs/sources/trackers/web-trackers/previous-versions/web-trackers-v3/quick-start-guide/index.md +++ b/docs/sources/trackers/web-trackers/previous-versions/web-trackers-v3/quick-start-guide/index.md @@ -30,7 +30,7 @@ The process involves the following high level steps: - Download the latest version of the Snowplow JavaScript tracker file, `sp.js`, which can be found [here](https://github.com/snowplow/snowplow-javascript-tracker/releases). - If you are already hosting static files somewhere on your own domain, it should just be a matter of downloading and adding the `sp.js` file. Otherwise you can follow our [guides for self hosting](../tracker-setup/hosting-the-javascript-tracker/index.md), use another method of your choice, or leverage a [Third Party CDN](../tracker-setup/hosting-the-javascript-tracker/third-party-cdn-hosting/index.md) (useful for evaluation or testing). - Once you have a JS tracker available, you can add the tag snippet to your site. There are also alternative options described below for adding the tracker to your website. - - If manually inserting the tag into your website or tag management solution: Snowplow BDP users can generate a tag snippet in the Snowplow BDP Console [here](https://console.snowplowanalytics.com/tag-generator). Other users can use and edit the standard tag [here](../tracker-setup/index.md). + - If manually inserting the tag into your website or tag management solution: Snowplow customers can generate a tag snippet in Snowplow Console [here](https://console.snowplowanalytics.com/tag-generator). Other users can use and edit the standard tag [here](../tracker-setup/index.md). 
```javascript ;(function(p,l,o,w,i,n,g){if(!p[i]){p.GlobalSnowplowNamespace=p.GlobalSnowplowNamespace||[]; p.GlobalSnowplowNamespace.push(i);p[i]=function(){(p[i].q=p[i].q||[]).push(arguments) };p[i].q=p[i].q||[];n=l.createElement(o);g=l.getElementsByTagName(o)[0];n.async=1; n.src=w;g.parentNode.insertBefore(n,g)}}(window,document,"script","{{URL to sp.js}}","snowplow")); diff --git a/docs/sources/trackers/web-trackers/previous-versions/web-trackers-v3/tracking-events/event-specifications/index.md b/docs/sources/trackers/web-trackers/previous-versions/web-trackers-v3/tracking-events/event-specifications/index.md index 17cf94220..06ea645f1 100644 --- a/docs/sources/trackers/web-trackers/previous-versions/web-trackers-v3/tracking-events/event-specifications/index.md +++ b/docs/sources/trackers/web-trackers/previous-versions/web-trackers-v3/tracking-events/event-specifications/index.md @@ -10,9 +10,9 @@ import Tabs from '@theme/Tabs'; import TabItem from '@theme/TabItem'; ``` -The plugin allows you to integrate with Event Specifications for a selected set of plugins. The configuration for the plugin should be retrieved directly from your [Data Product](https://docs.snowplow.io/docs/fundamentals/data-products/) in the [Snowplow BDP Console](https://console.snowplowanalytics.com). +The plugin allows you to integrate with Event Specifications for a selected set of plugins. The configuration for the plugin should be retrieved directly from your [Data Product](https://docs.snowplow.io/docs/fundamentals/data-products/) in [Snowplow Console](https://console.snowplowanalytics.com). -The plugin will automatically add an Event Specification context to the events matching the configuration added. +The plugin will automatically add an Event Specification context to the events matching the configuration added. :::note The plugin is available since version 3.23 of the tracker. @@ -24,7 +24,7 @@ The plugin is available since version 3.23 of the tracker. 
| Tracker Distribution | Included | -|----------------------|----------| +| -------------------- | -------- | | `sp.js` | ❌ | | `sp.lite.js` | ❌ | @@ -52,8 +52,8 @@ window.snowplow( import { newTracker } from '@snowplow/browser-tracker'; import { WebVitalsPlugin } from '@snowplow/browser-plugin-event-specifications'; -newTracker('sp1', '{{collector_url}}', { - appId: 'my-app-id', +newTracker('sp1', '{{collector_url}}', { + appId: 'my-app-id', plugins: [ EventSpecificationsPlugin(/* plugin configuration */) ], }); ``` @@ -75,10 +75,10 @@ window.snowplow( ['eventSpecifications', 'EventSpecificationsPlugin'], [ { - [Plugin integration name]: { + [Plugin integration name]: { /* Key value pairs of event names and event specification ids */ }, - /* More integrations */ + /* More integrations */ } ] ); @@ -89,7 +89,7 @@ window.snowplow( ```javascript EventSpecificationsPlugin({ - [Plugin integration name]: { + [Plugin integration name]: { /* Key value pairs of event names and event specification ids */ }, /* More integrations */ diff --git a/docs/sources/trackers/web-trackers/quick-start-guide/index.md b/docs/sources/trackers/web-trackers/quick-start-guide/index.md index cc5421b73..6cc5800df 100644 --- a/docs/sources/trackers/web-trackers/quick-start-guide/index.md +++ b/docs/sources/trackers/web-trackers/quick-start-guide/index.md @@ -30,7 +30,7 @@ The process involves the following high level steps: - Download the latest version of the Snowplow JavaScript tracker file, `sp.js`, which can be found [here](https://github.com/snowplow/snowplow-javascript-tracker/releases). - If you are already hosting static files somewhere on your own domain, it should just be a matter of downloading and adding the `sp.js` file. Otherwise you can follow our [guides for self hosting](/docs/sources/trackers/web-trackers/tracker-setup/hosting-the-javascript-tracker/index.md), use another method of your choice, or leverage a [Third Party CDN](/docs/sources/trackers/web-trackers/tracker-setup/hosting-the-javascript-tracker/third-party-cdn-hosting/index.md) (useful for evaluation or testing). - Once you have a JS tracker available, you can add the tag snippet to your site. There are also alternative options described below for adding the tracker to your website. - - If manually inserting the tag into your website or tag management solution: Snowplow BDP users can generate a tag snippet in the Snowplow BDP Console [here](https://console.snowplowanalytics.com/tag-generator). Other users can use and edit the standard tag [here](/docs/sources/trackers/web-trackers/tracker-setup/index.md). + - If manually inserting the tag into your website or tag management solution: Snowplow CDI customers can generate a tag snippet in Snowplow Console [here](https://console.snowplowanalytics.com/tag-generator). Other users can use and edit the standard tag [here](/docs/sources/trackers/web-trackers/tracker-setup/index.md). 
```javascript ;(function(p,l,o,w,i,n,g){if(!p[i]){p.GlobalSnowplowNamespace=p.GlobalSnowplowNamespace||[]; p.GlobalSnowplowNamespace.push(i);p[i]=function(){(p[i].q=p[i].q||[]).push(arguments) };p[i].q=p[i].q||[];n=l.createElement(o);g=l.getElementsByTagName(o)[0];n.async=1; n.src=w;g.parentNode.insertBefore(n,g)}}(window,document,"script","{{URL to sp.js}}","snowplow")); diff --git a/docs/sources/trackers/web-trackers/tracking-events/event-specifications/index.md b/docs/sources/trackers/web-trackers/tracking-events/event-specifications/index.md index c701c13e8..cdcecd42a 100644 --- a/docs/sources/trackers/web-trackers/tracking-events/event-specifications/index.md +++ b/docs/sources/trackers/web-trackers/tracking-events/event-specifications/index.md @@ -10,9 +10,9 @@ import Tabs from '@theme/Tabs'; import TabItem from '@theme/TabItem'; ``` -The plugin allows you to integrate with Event Specifications for a selected set of plugins. The configuration for the plugin should be retrieved directly from your [Data Product](https://docs.snowplow.io/docs/fundamentals/data-products/) in the [Snowplow BDP Console](https://console.snowplowanalytics.com). +The plugin allows you to integrate with Event Specifications for a selected set of plugins. The configuration for the plugin should be retrieved directly from your [Data Product](https://docs.snowplow.io/docs/fundamentals/data-products/) in [Snowplow Console](https://console.snowplowanalytics.com). -The plugin will automatically add an Event Specification context to the events matching the configuration added. +The plugin will automatically add an Event Specification context to the events matching the configuration added. :::note The plugin is available since version 3.23 of the tracker and is currently only available for Data Products created using the [Media Web template](/docs/data-product-studio/data-products/data-product-templates/#media-web). 
@@ -24,7 +24,7 @@ The plugin is available since version 3.23 of the tracker and is currently only | Tracker Distribution | Included | -|----------------------|----------| +| -------------------- | -------- | | `sp.js` | ❌ | | `sp.lite.js` | ❌ | @@ -52,8 +52,8 @@ window.snowplow( import { newTracker } from '@snowplow/browser-tracker'; import { WebVitalsPlugin } from '@snowplow/browser-plugin-event-specifications'; -newTracker('sp1', '{{collector_url}}', { - appId: 'my-app-id', +newTracker('sp1', '{{collector_url}}', { + appId: 'my-app-id', plugins: [ EventSpecificationsPlugin(/* plugin configuration */) ], }); ``` @@ -75,10 +75,10 @@ window.snowplow( ['eventSpecifications', 'EventSpecificationsPlugin'], [ { - [Plugin integration name]: { + [Plugin integration name]: { /* Key value pairs of event names and event specification ids */ }, - /* More integrations */ + /* More integrations */ } ] ); @@ -89,7 +89,7 @@ window.snowplow( ```javascript EventSpecificationsPlugin({ - [Plugin integration name]: { + [Plugin integration name]: { /* Key value pairs of event names and event specification ids */ }, /* More integrations */ diff --git a/plugins/docusaurus-plugin-snowplow-schema/src/schemaData/description.md b/plugins/docusaurus-plugin-snowplow-schema/src/schemaData/description.md index cbae5d5f5..9e67b9ae1 100644 --- a/plugins/docusaurus-plugin-snowplow-schema/src/schemaData/description.md +++ b/plugins/docusaurus-plugin-snowplow-schema/src/schemaData/description.md @@ -641,7 +641,7 @@ docs/api-reference/failed-events/index.md description: "API reference for handling and processing failed behavioral events in Snowplow data quality workflows." docs/api-reference/console-api.md -description: "Snowplow BDP Console API reference for programmatic management of behavioral data platform resources." +description: "Snowplow Console API reference for programmatic management of behavioral data platform resources." docs/api-reference/index.md description: "Complete API reference documentation for Snowplow behavioral data platform components and services." @@ -2617,17 +2617,17 @@ description: "Create compelling data visualizations and dashboards from transfor docs/modeling-your-data/visualization/marketing-dashboards/index.md description: "Build marketing performance dashboards from behavioral data for campaign analysis and optimization." -docs/modeling-your-data/running-data-models-via-snowplow-bdp/dbt/index.md -description: "Run dbt data models through Snowplow BDP Console for managed behavioral data transformation workflows." +docs/modeling-your-data/running-data-models-via-console/dbt/index.md +description: "Run dbt data models through Snowplow Console for managed behavioral data transformation workflows." -docs/modeling-your-data/running-data-models-via-snowplow-bdp/dbt/resolving-data-model-failures/index.md -description: "Troubleshoot and resolve dbt model failures in Snowplow BDP for reliable behavioral data processing." +docs/modeling-your-data/running-data-models-via-console/dbt/resolving-data-model-failures/index.md +description: "Troubleshoot and resolve dbt model failures in Snowplow CDI for reliable behavioral data processing." -docs/modeling-your-data/running-data-models-via-snowplow-bdp/retrieving-job-execution-data-via-the-api/index.md -description: "Retrieve job execution data via Snowplow BDP API for behavioral data model monitoring and debugging." 
+docs/modeling-your-data/running-data-models-via-console/retrieving-job-execution-data-via-the-api/index.md +description: "Retrieve job execution data via Snowplow CDI API for behavioral data model monitoring and debugging." -docs/modeling-your-data/running-data-models-via-snowplow-bdp/index.md -description: "Execute and manage data models through Snowplow BDP Console for streamlined behavioral analytics workflows." +docs/modeling-your-data/running-data-models-via-console/index.md +description: "Execute and manage data models through Snowplow Console for streamlined behavioral analytics workflows." docs/modeling-your-data/modeling-your-data-with-sql-runner/sql-runner-mobile-data-model/index.md description: "Build mobile analytics data models using SQL Runner for behavioral mobile app analysis." @@ -2944,16 +2944,16 @@ description: "Handle cookies and ad blockers in behavioral event tracking for ac docs/get-started/tracking/index.md description: "Implement behavioral event tracking across web, mobile, and server-side applications using Snowplow trackers." -docs/get-started/snowplow-bdp/setup-guide-gcp/index.md +docs/get-started/snowplow-CDI/setup-guide-gcp/index.md description: "Set up Snowplow Behavioral Data Platform on Google Cloud Platform for enterprise behavioral analytics infrastructure." -docs/get-started/snowplow-bdp/index.md +docs/get-started/snowplow-CDI/index.md description: "Get started with Snowplow Behavioral Data Platform for enterprise-scale customer data infrastructure and analytics." -docs/get-started/snowplow-bdp/setup-guide-azure/index.md +docs/get-started/snowplow-CDI/setup-guide-azure/index.md description: "Deploy Snowplow Behavioral Data Platform on Microsoft Azure for enterprise behavioral analytics infrastructure." -docs/get-started/snowplow-bdp/setup-guide-aws/index.md +docs/get-started/snowplow-CDI/setup-guide-aws/index.md description: "Set up Snowplow Behavioral Data Platform on Amazon Web Services for enterprise behavioral analytics infrastructure." docs/get-started/querying/index.md @@ -3131,4 +3131,4 @@ tutorials/flink-live-shopper-features/run.md description: "Run and deploy Flink live shopper features for real-time behavioral data processing and personalization." tutorials/flink-live-shopper-features/calculations.md -description: "Implement behavioral data calculations for live shopper features using Apache Flink stream processing." \ No newline at end of file +description: "Implement behavioral data calculations for live shopper features using Apache Flink stream processing." 
diff --git a/plugins/docusaurus-plugin-snowplow-schema/src/schemaData/keywords.md b/plugins/docusaurus-plugin-snowplow-schema/src/schemaData/keywords.md index 51d2ca99d..44914b4e6 100644 --- a/plugins/docusaurus-plugin-snowplow-schema/src/schemaData/keywords.md +++ b/plugins/docusaurus-plugin-snowplow-schema/src/schemaData/keywords.md @@ -1270,17 +1270,17 @@ docs/modeling-your-data/visualization/index.md docs/modeling-your-data/visualization/marketing-dashboards/index.md "Marketing Dashboards", "Marketing Analytics", "Campaign Analytics", "Marketing Insights", "Marketing ROI", "Campaign Performance" -docs/modeling-your-data/running-data-models-via-snowplow-bdp/dbt/index.md +docs/modeling-your-data/running-data-models-via-console/dbt/index.md "DBT Models", "Data Transformation", "SQL Models", "Analytics Engineering", "Data Modeling", "DBT Pipeline" -docs/modeling-your-data/running-data-models-via-snowplow-bdp/dbt/resolving-data-model-failures/index.md +docs/modeling-your-data/running-data-models-via-console/dbt/resolving-data-model-failures/index.md "DBT Troubleshooting", "Model Failures", "DBT Debugging", "Model Errors", "Data Issues", "DBT Resolution" -docs/modeling-your-data/running-data-models-via-snowplow-bdp/retrieving-job-execution-data-via-the-api/index.md +docs/modeling-your-data/running-data-models-via-console/retrieving-job-execution-data-via-the-api/index.md "Job Execution", "API Data", "Execution Analytics", "Job Monitoring", "Pipeline API", "Execution Metrics" -docs/modeling-your-data/running-data-models-via-snowplow-bdp/index.md -"Snowplow BDP", "Managed Models", "BDP Pipeline", "Managed Analytics", "Cloud Pipeline", "Enterprise Analytics" +docs/modeling-your-data/running-data-models-via-console/index.md +"Snowplow CDI", "Managed Models", "CDI Pipeline", "Managed Analytics", "Cloud Pipeline", "Enterprise Analytics" docs/modeling-your-data/modeling-your-data-with-sql-runner/sql-runner-mobile-data-model/index.md "SQL Runner Mobile", "Mobile SQL", "Mobile Analytics", "SQL Models", "Mobile Data", "App Analytics" @@ -1597,17 +1597,17 @@ docs/get-started/tracking/cookies-and-ad-blockers/index.md docs/get-started/tracking/index.md "Event Tracking", "Analytics Tracking", "Getting Started", "Data Collection", "Tracking Setup", "Behavioral Tracking" -docs/get-started/snowplow-bdp/setup-guide-gcp/index.md -"BDP GCP", "Google Cloud", "BDP Setup", "Cloud Setup", "GCP Guide", "Enterprise Setup" +docs/get-started/snowplow-CDI/setup-guide-gcp/index.md +"CDI GCP", "Google Cloud", "CDI Setup", "Cloud Setup", "GCP Guide", "Enterprise Setup" -docs/get-started/snowplow-bdp/index.md -"Snowplow BDP", "Behavioral Data Platform", "Enterprise Analytics", "Managed Platform", "Cloud Analytics", "BDP Overview" +docs/get-started/snowplow-CDI/index.md +"Snowplow CDI", "Behavioral Data Platform", "Enterprise Analytics", "Managed Platform", "Cloud Analytics", "CDI Overview" -docs/get-started/snowplow-bdp/setup-guide-azure/index.md -"BDP Azure", "Azure Setup", "BDP Setup", "Cloud Setup", "Azure Guide", "Enterprise Setup" +docs/get-started/snowplow-CDI/setup-guide-azure/index.md +"CDI Azure", "Azure Setup", "CDI Setup", "Cloud Setup", "Azure Guide", "Enterprise Setup" -docs/get-started/snowplow-bdp/setup-guide-aws/index.md -"BDP AWS", "AWS Setup", "BDP Setup", "Cloud Setup", "AWS Guide", "Enterprise Setup" +docs/get-started/snowplow-CDI/setup-guide-aws/index.md +"CDI AWS", "AWS Setup", "CDI Setup", "Cloud Setup", "AWS Guide", "Enterprise Setup" docs/get-started/querying/index.md "Data Querying", "SQL Queries", 
"Getting Started", "Data Analysis", "Query Guide", "Analytics Queries" diff --git a/sidebars.js b/sidebars.js index b3016d2f4..451bae30c 100644 --- a/sidebars.js +++ b/sidebars.js @@ -13,40 +13,50 @@ const swap = (allItems, linkItems, descriptions) => { const result = allItems.flatMap((item) => { - const header = item.customProps?.header ? [{ - type: 'html', - value: item.customProps.header, - defaultStyle: true, - className: 'header', - }] : [] + const header = item.customProps?.header + ? [ + { + type: 'html', + value: item.customProps.header, + defaultStyle: true, + className: 'header', + }, + ] + : [] const className = [ item.className || '', - (item.customProps?.hidden ? 'hidden' : ''), - ...(item.customProps?.offerings || []) + item.customProps?.hidden ? 'hidden' : '', ].join(' ') if (item.type === 'category') { // a workaround for category pages not picking up the description in index.md // see https://docusaurus.io/feature-requests/p/allow-customizing-category-description-in-generated-index-cards - const customProps = descriptions[item.link?.id] ? - {...item.customProps, description: descriptions[item.link.id]} : - item.customProps - if (item.items.length > 0) return [...header, { - ...item, - className, - customProps, - items: swap(item.items, linkItems, descriptions), - }] + const customProps = descriptions[item.link?.id] + ? { ...item.customProps, description: descriptions[item.link.id] } + : item.customProps + if (item.items.length > 0) + return [ + ...header, + { + ...item, + className, + customProps, + items: swap(item.items, linkItems, descriptions), + }, + ] // a workaround for empty category pages not respecting className // see https://discord.com/channels/398180168688074762/867060369087922187/1068508121091293264 - return [...header, { - type: 'doc', - id: item.link.id, - label: item.label, - className, - customProps, - }] + return [ + ...header, + { + type: 'doc', + id: item.link.id, + label: item.label, + className, + customProps, + }, + ] } if (linkItems[item.id]) { @@ -61,7 +71,7 @@ const swap = (allItems, linkItems, descriptions) => { ] } - return [{...item, className}] + return [{ ...item, className }] }) return result diff --git a/src/pages/restrictive-access-models.md b/src/pages/restrictive-access-models.md index 5e3a325d5..be4ca3091 100644 --- a/src/pages/restrictive-access-models.md +++ b/src/pages/restrictive-access-models.md @@ -2,7 +2,7 @@ ## Introduction -Snowplow’s PMC ("Private Managed Cloud") deployment model leverages a proprietary Infrastructure-as-Code (IaC) platform to deploy and manage the various components for a Snowplow Pipeline into customer-managed cloud environments such as AWS, GCP, or Azure. Our deployment architecture has been intentionally designed over the years to strike a balance between customer-specific configurability and Snowplow’s own best practices for running this pipeline at scale. It has also been designed to operate seamlessly within our overall BDP Console experience, which is hosted and managed by Snowplow. While we do offer significant configurability for both the pipeline itself and the deployment model to accommodate the widest possible variety of customer-specific requirements, there are certain constraints that prevent us from supporting certain cloud environments or custom solutions that impose significant restrictions on access, deployment, and monitoring capabilities. 
+Snowplow’s PMC ("Private Managed Cloud") deployment model leverages a proprietary Infrastructure-as-Code (IaC) platform to deploy and manage the various components for a Snowplow Pipeline into customer-managed cloud environments such as AWS, GCP, or Azure. Our deployment architecture has been intentionally designed over the years to strike a balance between customer-specific configurability and Snowplow’s own best practices for running this pipeline at scale. It has also been designed to operate seamlessly within our overall Console experience, which is hosted and managed by Snowplow. While we do offer significant configurability for both the pipeline itself and the deployment model to accommodate the widest possible variety of customer-specific requirements, there are certain constraints that prevent us from supporting certain cloud environments or custom solutions that impose significant restrictions on access, deployment, and monitoring capabilities. ## Access to dedicated networking resources diff --git a/src/pages/style-guide/index.md b/src/pages/style-guide/index.md index 15bb69327..373350e9f 100644 --- a/src/pages/style-guide/index.md +++ b/src/pages/style-guide/index.md @@ -457,13 +457,13 @@ Here are two pieces of older content that only partially follow the style guide. ### Pipeline components - Console is capitalized, and doesn't have a definite article (no "the") -- It can also be called "BDP Console" or "Snowplow BDP Console" +- It can also be called "Snowplow Console" - This is fine at the start of a piece of writing but feels overly wordy if used throughout, so maybe open with that then just call it "Console" subsequently - | ✅ | ❌ | - | ------------------------------ | --------------------------------------- | - | data structures in Console | data structures in the console | - | data structures in BDP Console | data structures in the Snowplow Console | + | ✅ | ❌ | + | ----------------------------------- | --------------------------------------- | + | data structures in Console | data structures in the console | + | data structures in Snowplow Console | data structures in the Snowplow Console | - Collector is capitalized, and gets a definite article ("the") - Use "the Collector endpoint" where possible for clarity - the reader might not know what we mean by "Collector", but they probably know what an endpoint is @@ -556,7 +556,7 @@ Here are two pieces of older content that only partially follow the style guide. | SDKs | | | URLs | | -- For the **documentation** website, we have a docs plugin that adds a dotted line and a tooltip explanation to acronyms, e.g. hovering over "BDP" will show that it stands for "Behavioral Data Platform": add new acronym definitions to the `src/remark/abbreviations.js` file to enable this behavior +- For the **documentation** website, we have a docs plugin that adds a dotted line and a tooltip explanation to acronyms, e.g. 
hovering over "CDI" will show that it stands for "Customer Data Infrastructure": add new acronym definitions to the `src/remark/abbreviations.js` file to enable this behavior - (Technically, these are all initialisms, not acronyms) ### Capitalization diff --git a/src/remark/abbreviations.js b/src/remark/abbreviations.js index 2af1c27d1..7052d80f4 100644 --- a/src/remark/abbreviations.js +++ b/src/remark/abbreviations.js @@ -5,7 +5,6 @@ const plugin = () => { AAID: 'Android Advertising ID', ADLS: 'Azure Data Lake Storage', AWS: 'Amazon Web Services', - BDP: 'Behavioral Data Platform', CDI: 'Customer Data Infrastructure', CDN: 'Content Delivery Network', CDP: 'Customer Data Platform', @@ -26,6 +25,7 @@ const plugin = () => { OSS: 'Open Source Software', QA: 'Quality Assurance', PII: 'Personally Identifiable Information', + PMC: 'Private Managed Cloud', RDS: 'Amazon Relational Database Service', S3: 'Amazon Cloud Object Storage', SS: 'Server Side', diff --git a/src/theme/MDXContent/index.js b/src/theme/MDXContent/index.js index 6962e163d..8194b0b91 100644 --- a/src/theme/MDXContent/index.js +++ b/src/theme/MDXContent/index.js @@ -5,13 +5,8 @@ import Head from '@docusaurus/Head' import { useSidebarBreadcrumbs } from '@docusaurus/theme-common/internal' import _ from 'lodash' -const offeringNames = { - bdp: "Snowplow BDP", - community: "Snowplow Community Edition", -} - export default function MDXContentWrapper(props) { - let breadcrumbs; + let breadcrumbs try { breadcrumbs = useSidebarBreadcrumbs() } catch { @@ -21,13 +16,22 @@ export default function MDXContentWrapper(props) { const admonitions = [] - const legacy = _.some(_.initial(breadcrumbs), item => item.customProps?.legacy) - const outdated = !legacy && _.some(_.initial(breadcrumbs), item => item.customProps?.outdated) - const hidden = _.some(_.initial(breadcrumbs), item => item.customProps?.hidden) - const offerings = _.findLast(breadcrumbs, item => item.customProps?.offerings) + const legacy = _.some( + _.initial(breadcrumbs), + (item) => item.customProps?.legacy + ) + const outdated = + !legacy && + _.some(_.initial(breadcrumbs), (item) => item.customProps?.outdated) + const hidden = _.some( + _.initial(breadcrumbs), + (item) => item.customProps?.hidden + ) if (outdated) { - const latest = _.last(_.takeWhile(breadcrumbs, item => !item.customProps?.outdated)).href + const latest = _.last( + _.takeWhile(breadcrumbs, (item) => !item.customProps?.outdated) + ).href admonitions.push( You are reading documentation for an outdated version. Here’s the{' '} @@ -36,16 +40,6 @@ export default function MDXContentWrapper(props) { ) } - if (offerings) { - const names = offerings.customProps.offerings.map(o => offeringNames[o]) - admonitions.push( - - This documentation only applies to {names.join(' and ')}. - See the feature comparison page for more information about the different Snowplow offerings. 
- - ) - } - return ( <> {(legacy || outdated || hidden) && ( diff --git a/static/_redirects b/static/_redirects index 51c361f00..af3bcc311 100644 --- a/static/_redirects +++ b/static/_redirects @@ -412,3 +412,8 @@ docs/understanding-tracking-design/managing-data-structures-with-data-structures /docs/destinations/warehouses-lakes/schemas-in-warehouse/* /docs/api-reference/loaders-storage-targets/schemas-in-warehouse/:splat 301 /docs/sources/trackers/google-tag-manager/snowtype/* /docs/data-product-studio/snowtype/working-with-gtm/ 301 + +# Removing BDP name +/docs/get-started/snowplow-bdp/* /docs/get-started/private-managed-cloud/:splat 301 +/docs/modeling-your-data/running-data-models-via-snowplow-bdp/* /docs/modeling-your-data/running-data-models-via-console/:splat 301 +/docs/get-started/snowplow-community-edition/* /docs/get-started/self-hosted/:splat 301 diff --git a/tutorials/abandoned-browse-ccdp/introduction.md b/tutorials/abandoned-browse-ccdp/introduction.md index b224d96ef..f578058bd 100644 --- a/tutorials/abandoned-browse-ccdp/introduction.md +++ b/tutorials/abandoned-browse-ccdp/introduction.md @@ -10,7 +10,7 @@ Abandoned browse is a common ecommerce problem where users show interest in prod --- -This tutorial demonstrates how to implement an abandoned browse tracking and re-engagement system using [Snowplow](https://snowplow.io/), [Snowflake](https://www.snowflake.com/), and [Census](https://www.getcensus.com/). This solution helps ecommerce businesses identify and re-engage users who have shown interest in a product (e.g., viewed something for 10+ seconds) but haven't proceeded further. +This tutorial demonstrates how to implement an abandoned browse tracking and re-engagement system using [Snowplow](https://snowplow.io/), [Snowflake](https://www.snowflake.com/), and [Census](https://www.getcensus.com/). This solution helps ecommerce businesses identify and re-engage users who have shown interest in a product (e.g., viewed something for 10+ seconds) but haven't proceeded further. 
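As a rough illustration of the identification step, the query below sketches how the abandoned-browse audience might be pulled from Snowplow data in Snowflake. The table name, URL pattern, and the 10-second page ping interval are assumptions to adapt to your own setup; the rest of the tutorial covers the full implementation.

```sql
-- Minimal sketch: users who viewed a product page for 10+ seconds in the last day.
-- Assumes a standard Snowplow load into ATOMIC.EVENTS, product URLs under /products/,
-- and activity tracking (page pings) configured at 10-second intervals.
select
  domain_userid,
  page_urlpath,
  count_if(event_name = 'page_ping') * 10 as approx_seconds_engaged
from atomic.events
where collector_tstamp >= dateadd('hour', -24, current_timestamp())
  and page_urlpath like '/products/%'
group by domain_userid, page_urlpath
having count_if(event_name = 'page_ping') * 10 >= 10;
```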
--- ![Composable CDP](images/retl-snowplow-composable-cdp.png) @@ -20,10 +20,10 @@ This tutorial demonstrates how to implement an abandoned browse tracking and re- ## Prerequisites - An ecommerce website with a product catalog to track events from -- **Snowplow instance**: - - [Localstack](https://github.com/snowplow-incubator/snowplow-local) (recommended) - - [Community edition](/docs/get-started/snowplow-community-edition) - - BDP Enterprise if you're already a customer +- **Snowplow instance**: + - [Localstack](https://github.com/snowplow-incubator/snowplow-local) (recommended) + - [Community edition](/docs/get-started/snowplow-community-edition) + - Snowplow CDI if you're already a customer - **Access to a data warehouse**: e.g., [Snowflake](https://www.snowflake.com) - **Reverse ETL**: [Census Reverse ETL](https://www.getcensus.com) or Snowplow Reverse ETL - **Marketing automation platform**: e.g., [Braze](https://www.braze.com) diff --git a/tutorials/data-products-base-tracking/meta.json b/tutorials/data-products-base-tracking/meta.json index 0fc7a599c..d70be6a6b 100644 --- a/tutorials/data-products-base-tracking/meta.json +++ b/tutorials/data-products-base-tracking/meta.json @@ -1,8 +1,8 @@ { "title": "Track web events with base data products", "label": "Data governance", - "description": "Use the BDP Console to create and manage data products for a web application.", + "description": "Use the Snowplow Console to create and manage data products for a web application.", "useCases": ["Composable analytics"], "technologies": [], - "snowplowTech": ["Snowtype", "BDP Console"] + "snowplowTech": ["Snowtype", "Console"] } diff --git a/tutorials/data-structures-in-git/follow-up-with-data-products.md b/tutorials/data-structures-in-git/follow-up-with-data-products.md index 7e9362dec..08223462b 100644 --- a/tutorials/data-structures-in-git/follow-up-with-data-products.md +++ b/tutorials/data-structures-in-git/follow-up-with-data-products.md @@ -75,7 +75,7 @@ When we're happy with the proposed changes, we can publish by removing the `--dr snowplow-cli dp publish ``` -After publishing, you'll be able to see your new source application in the BDP Console UI. +After publishing, you'll be able to see your new source application in the Snowplow Console UI. ## Create a data product and an event specification diff --git a/tutorials/data-structures-in-git/introduction.md b/tutorials/data-structures-in-git/introduction.md index cebc6a4a8..c97081c2f 100644 --- a/tutorials/data-structures-in-git/introduction.md +++ b/tutorials/data-structures-in-git/introduction.md @@ -13,11 +13,11 @@ The Snowplow Console's UI has facilities to get started quickly with data struct A common solution when faced with these requirements is to move management to some form of version control platform (github/gitlab). This opens up an entire ecosystem of tools and patterns enabling all manner of custom workflows. -We have built [Snowplow CLI](https://docs.snowplow.io/docs/data-product-studio/snowplow-cli) to help you bridge the gap between these repository-based workflows and BDP Console. +We have built [Snowplow CLI](https://docs.snowplow.io/docs/data-product-studio/snowplow-cli) to help you bridge the gap between these repository-based workflows and Snowplow Console. 
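In rough terms, the loop this tutorial builds looks like the sketch below. The `validate` subcommand name is an assumption here; the exact commands, arguments, and GitHub Actions automation are covered step by step in the rest of the tutorial.

```bash
# Indicative workflow only — exact subcommands and flags are shown in the following pages
git checkout -b add-login-data-structure    # treat tracking changes like any other code change
snowplow-cli ds validate                    # assumed: validate local data structure files
git commit -am "Add login data structure"   # open a pull request for review as usual
snowplow-cli ds publish                     # publish the approved data structure to Console
```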
## Prerequisites -* A deployed Snowplow BDP pipeline +* A deployed Snowplow pipeline * [Snowplow CLI](https://docs.snowplow.io/docs/data-product-studio/snowplow-cli) installed and configured * A familiarity with [git](https://git-scm.com/) and an understanding of [GitHub Actions](https://docs.github.com/en/actions/writing-workflows) * A sensible [terminal emulator](https://en.wikipedia.org/wiki/Terminal_emulator) and shell @@ -25,5 +25,3 @@ We have built [Snowplow CLI](https://docs.snowplow.io/docs/data-product-studio/s ## What you'll be doing This tutorial will walk through creating and deploying a data structure from the command line using [Snowplow CLI](https://docs.snowplow.io/docs/data-product-studio/snowplow-cli). It will then show how it is possible to automate the validation and deployment process using [GitHub Actions](https://docs.github.com/en/actions/writing-workflows). - - diff --git a/tutorials/data-structures-in-git/local-setup.md b/tutorials/data-structures-in-git/local-setup.md index 1dab2eeea..eaa00b20d 100644 --- a/tutorials/data-structures-in-git/local-setup.md +++ b/tutorials/data-structures-in-git/local-setup.md @@ -56,7 +56,7 @@ data: ``` * `apiVersion` should always be `v1` * `resourceType` should remain `data-structure` -* `meta.hidden` directly relates to showing and hiding [in BDP Console UI](https://docs.snowplow.io/docs/data-product-studio/data-structures/manage/#hiding-a-data-structure) +* `meta.hidden` directly relates to showing and hiding [in Console UI](https://docs.snowplow.io/docs/data-product-studio/data-structures/manage/#hiding-a-data-structure) * `meta.schemaType` can be `event` or `entity` * `meta.customData` is a map of strings to strings that can be used to send across any key/value pairs you'd like to associate with the data structure * `data` is the actual [snowplow self describing schema](https://docs.snowplow.io/docs/api-reference/iglu/common-architecture/self-describing-json-schemas) that this data structure describes diff --git a/tutorials/data-structures-in-git/validation-and-publishing.md b/tutorials/data-structures-in-git/validation-and-publishing.md index 085f4a00b..af710f863 100644 --- a/tutorials/data-structures-in-git/validation-and-publishing.md +++ b/tutorials/data-structures-in-git/validation-and-publishing.md @@ -75,7 +75,7 @@ The command should output something close to the following: Publishing to `dev` will also run validation. It will only fail on ERROR notifications. ::: -You should now be able to see your published data structure in [BDP Console UI](https://console.snowplowanalytics.com/data-structures). If you click through from the data structure listing to view the `login` data structure you should see the following banner. +You should now be able to see your published data structure in [Console UI](https://console.snowplowanalytics.com/data-structures). If you click through from the data structure listing to view the `login` data structure you should see the following banner. 
![](./images/locked.png) diff --git a/tutorials/kafka-live-viewer-profiles/introduction.md b/tutorials/kafka-live-viewer-profiles/introduction.md index 7ca14f165..a5b5a0b4f 100644 --- a/tutorials/kafka-live-viewer-profiles/introduction.md +++ b/tutorials/kafka-live-viewer-profiles/introduction.md @@ -34,7 +34,7 @@ The solution comprises several interconnected components: - Code available in [tracker-frontend](https://github.com/snowplow-industry-solutions/kafka-live-viewer-profiles/tree/main/tracker-frontend) folder in GitHub - **Snowplow Collector**: - - Collects and forwards events via [Stream Enrich](/docs/fundamentals/architecture-overview) and Kinesis to [Snowbridge](/docs/destinations/forwarding-events/snowbridge). + - Collects and forwards events via [Stream Enrich](/docs/fundamentals/architecture-overview) and Kinesis to [Snowbridge](/docs/api-reference/snowbridge). - **Snowplow Snowbridge**: - Publishes events to Kafka for the Live Viewer Backend to consume diff --git a/tutorials/signals-quickstart/meta.json b/tutorials/signals-quickstart/meta.json index 8cf538b91..da94bc477 100644 --- a/tutorials/signals-quickstart/meta.json +++ b/tutorials/signals-quickstart/meta.json @@ -4,5 +4,5 @@ "label": "Signals implementation", "useCases": ["Real-time personalization"], "technologies": ["Jupyter notebook"], - "snowplowTech": ["Signals", "BDP Console"] + "snowplowTech": ["Signals", "Console"] } diff --git a/tutorials/snowplow-cli-mcp/introduction.md b/tutorials/snowplow-cli-mcp/introduction.md index 038634761..4b5ec47bf 100644 --- a/tutorials/snowplow-cli-mcp/introduction.md +++ b/tutorials/snowplow-cli-mcp/introduction.md @@ -5,12 +5,12 @@ title: Getting started with the Snowplow MCP server for tracking design The Snowplow CLI MCP (Model Context Protocol) tool integrates Snowplow's data structure management capabilities directly into AI assistants like Claude. This enables natural language interaction for creating, validating, and managing your Snowplow tracking plans **locally**. -**Important**: The MCP tool creates and validates files on your local filesystem only. To sync changes to BDP Console, you'll use the regular CLI commands like `snowplow-cli ds publish` afterward. +**Important**: the MCP tool creates and validates files on your local filesystem only. To sync changes to Console, you'll use the regular CLI commands like `snowplow-cli ds publish` afterward. ## What you'll learn - How to set up the Snowplow CLI MCP tool with AI assistants like Claude Desktop, Cursor, or Copilot -- Available MCP tools and their functions +- Available MCP tools and their functions - Creating and validating data structures through conversation - AI-powered analysis for strategic tracking plan development @@ -30,9 +30,9 @@ The Snowplow CLI MCP (Model Context Protocol) tool integrates Snowplow's data st ## Prerequisites - Snowplow CLI installed ([installation guide](/docs/data-product-studio/snowplow-cli/#install)) -- Snowplow CLI configured with your BDP Console credentials ([configuration guide](/docs/data-product-studio/snowplow-cli/#configure)) +- Snowplow CLI configured with your Console credentials ([configuration guide](/docs/data-product-studio/snowplow-cli/#configure)) - Claude Desktop or another MCP-compatible client (Cursor or Copilot) -- **Filesystem access**: If using Claude Desktop, you must run alongside an MCP filesystem server (e.g., `@modelcontextprotocol/server-filesystem`) to enable file operations. Other MCP clients (Cursor, Copilot, etc.) have filesystem access by default. 
+- **Filesystem access**: if using Claude Desktop, you must run alongside an MCP filesystem server (e.g., `@modelcontextprotocol/server-filesystem`) to enable file operations. Other MCP clients (Cursor, Copilot, etc.) have filesystem access by default. ## Available MCP tools