
Conversation

koletzilla
Contributor

Summary

More updates to the dbt docs:

  • CI/CD suggestions.
  • How to troubleshoot long-running operations.
  • Complete index docs.
  • Document projections.
  • Document typing complex types.


vercel bot commented Oct 12, 2025

The latest updates on your projects.

| Project | Deployment | Preview | Updated (UTC) |
| --- | --- | --- | --- |
| clickhouse-docs | Ready | Preview | Oct 13, 2025 5:32pm |

3 Skipped Deployments

| Project | Deployment | Preview | Updated (UTC) |
| --- | --- | --- | --- |
| clickhouse-docs-jp | Ignored | Ignored | Oct 13, 2025 5:32pm |
| clickhouse-docs-ru | Ignored | Preview | Oct 13, 2025 5:32pm |
| clickhouse-docs-zh | Ignored | Preview | Oct 13, 2025 5:32pm |

@koletzilla koletzilla changed the title CI/CD suggestions. How to troubleshooting long running operations. Co… More small additions for dbt docs Oct 12, 2025
@koletzilla koletzilla self-assigned this Oct 12, 2025
@koletzilla koletzilla marked this pull request as ready for review October 12, 2025 11:59
@koletzilla koletzilla requested review from a team as code owners October 12, 2025 11:59
To keep your development environments in sync and avoid running your models against stale deployments, you can use [clone](https://docs.getdbt.com/reference/commands/clone) or even [defer](https://docs.getdbt.com/reference/node-selection/defer).
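The workflow described above could look something like this; a minimal sketch only, assuming a configured `prod` target and that production run artifacts (including `manifest.json`) have been saved under a hypothetical `prod-run-artifacts/` directory:

```sh
# Clone production relations into your development schema instead of
# rebuilding them from scratch (zero-copy where the warehouse supports it):
dbt clone --state prod-run-artifacts/

# Or defer: build only the selected model, resolving unbuilt upstream
# references against the production manifest:
dbt run --select my_model --defer --state prod-run-artifacts/
```

The `--state` path and model name are placeholders; the general pattern is that `clone` materializes copies while `defer` avoids materializing upstream models at all.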

It's better to use a separate ClickHouse cluster (a `staging` one) to handle the testing phase. That way you avoid impacting the performance of your production environment and the data there. You can keep a small subset of your production data in the staging cluster and run your models against it. There are different ways of handling this:
- If your data doesn't need to be very recent, you can load backups of your production data into the staging cluster.
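As a hedged illustration of the backup-based approach, one could snapshot a table to object storage from production and restore it on staging using ClickHouse's `BACKUP`/`RESTORE` statements. Hostnames, the table name, the bucket URL, and the credentials below are all placeholders:

```sh
# Illustrative sketch only: snapshot a production table to S3...
clickhouse-client --host prod.example.com --query \
  "BACKUP TABLE analytics.events TO S3('https://bucket.s3.amazonaws.com/backups/events', '<key>', '<secret>')"

# ...then restore it on the staging cluster from the same location.
clickhouse-client --host staging.example.com --query \
  "RESTORE TABLE analytics.events FROM S3('https://bucket.s3.amazonaws.com/backups/events', '<key>', '<secret>')"
```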
Contributor
This approach isn't very practical, since backups always require a new service to be created. Again, worth experimenting with it so we have a better idea of how this could work, in practice. For example: is it realistic to create a new service each time you need to refresh the data?

Contributor Author
> is it realistic to create a new service each time you need to refresh the data?

In my mind it is, because the user may not need to refresh the data daily, perhaps just once every several months. In that case it may be worth simply suggesting that they create this copy manually. If the user needs frequent updates, I guess we should instead recommend going the other way, with ClickPipes and similar tooling.

Still, I don't have strong opinions on that. We can discuss it in the issue ClickHouse/dbt-clickhouse#547.

Contributor Author

Closing this thread to move the conversation to the issue.

@koletzilla
Contributor Author

Thanks @morsapaes for all the comments and suggestions in this PR. I really appreciate them!

@Blargian Blargian merged commit 710421a into main Oct 14, 2025
15 checks passed
