More small additions for dbt docs #4562
Conversation
…mplete index docs. Document projections. Document typing complex types.
To keep your development environments in sync and avoid running your models against stale deployments, you can use [clone](https://docs.getdbt.com/reference/commands/clone) or even [defer](https://docs.getdbt.com/reference/node-selection/defer).
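As a minimal sketch of how the two commands are used (the `./prod-artifacts` path and the `my_model` selector are placeholders; the path stands in for wherever you keep the artifacts, such as `manifest.json`, from a production run):

```bash
# Placeholder path: assumes a production run's artifacts were saved
# to ./prod-artifacts, e.g. by copying target/ after a prod `dbt build`.

# clone: copy the production relations into your development schema
# without rebuilding them.
dbt clone --state ./prod-artifacts

# defer: build only the selected model locally, resolving any upstream
# refs you haven't built from the production manifest.
dbt run --select my_model --defer --state ./prod-artifacts
```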
It's better to use a separate ClickHouse cluster (a `staging` one) to handle the testing phase. That way you avoid impacting the performance of your production environment and the data in it. You can keep a small subset of your production data in staging and run your models against that. There are different ways of handling this:
- If your data doesn't need to be fully up to date, you can load backups of your production data into the staging cluster (see the sketch below).
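As a hedged sketch of the backup route on self-managed ClickHouse (in ClickHouse Cloud, restoring a backup creates a new service instead, as discussed in the thread below), with placeholder hosts, table name, bucket, and credentials:

```bash
# All host names, table names, and credentials here are placeholders.
# On the production host, back the table up to object storage:
clickhouse-client --host prod.internal --query "
  BACKUP TABLE analytics.events
  TO S3('https://my-bucket.s3.amazonaws.com/staging-backups/events', 'key-id', 'secret')"

# On the staging host, restore it from the same location:
clickhouse-client --host staging.internal --query "
  RESTORE TABLE analytics.events
  FROM S3('https://my-bucket.s3.amazonaws.com/staging-backups/events', 'key-id', 'secret')"
```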
This approach isn't very practical, since backups always require a new service to be created. Again, it's worth experimenting with so we have a better idea of how this could work in practice. For example: is it realistic to create a new service each time you need to refresh the data?
> is it realistic to create a new service each time you need to refresh the data?

In my mind it is, because the user may not need to refresh the data daily; it might be just once every several months. In that case it may be worth simply suggesting that they create this copy manually. If the user needs frequent updates, I guess we should then recommend going the other way, with ClickPipes and so on.

Still, I don't have strong opinions on that. We can discuss it in the issue ClickHouse/dbt-clickhouse#547
Closing this thread to move the conversation to the issue.
Thanks @morsapaes for all the comments and suggestions in this PR. I really appreciate them!
Summary
More updates to the dbt docs: