Enable offline install of labs projects #2049
base: main
Conversation
lgtm
@pietern I have addressed your comments. Could you please review and send for integration tests?
```go
	return nil, file_err
}
fileName := filepath.Join(libDir, "labs.yml")
raw, err = os.ReadFile(fileName)
```
Please document how you expect this to work.
@pietern I have addressed your other comments. As this is my first PR, could you please point me to where I should update the documentation for the CLI? The basic idea is that, for a system with no internet access:
- install the labs project on a machine which has internet access
- zip and copy the files to the intended machine and run `databricks labs install --offline=true`
- it will look for the code in the same install directory and, if present, load it from there.
Thanks. Please include this as a comment in the test as well as the PR description.
@pietern I have updated the PR, please check.
…tion (databricks#2043) ## Changes Changes the warning about an incomplete / implicit permissions section into a recommendation, and does a minor bit of cleanup. ## Tests New unit test.
## Changes - Instead of collecting requests in memory and writing them at the end of the test, write them right away. Then test authors can do filtering with jq in 'script' or collect individual files per different command. - testserver is now simpler - it just calls a caller-provided function. The logging logic is moved to acceptance_test.go. See https://github.com/databricks/cli/pull/2359/files#r1967591173 ## Tests Existing tests.
…ource reference doc template (databricks#2314) ## Changes - Added some missing descriptions to annotations.yml - Fixed links in the resource reference doc template ## Tests
## Changes Previously, one could not set `LocalOnly=true` in parent directory and then override it with `LocalOnly=false` in child directory. This is because, `false` is considered empty value by mergo. In order to distinguish between 'explicitly set to false' and 'not set', I've changed all simple variables in config to be pointers. Now, one can always override those, because non-nil pointers are not null (with mergo.WithoutDereference option). ## Tests Manually: ``` ~/work/cli/acceptance/bundle/templates/default-python % cat test.toml # add this new file LocalOnly = false ~/work/cli/acceptance/bundle/templates/default-python % CLOUD_ENV=aws go test ../../.. -run ^TestAccept$/^bundle$/^templates$/^default-python$ -v (the test is no longer skipped) ```
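The false-vs-unset distinction above can be illustrated with a small standalone sketch. This is not mergo itself: the `merge` function below hand-implements the analogous semantics (a non-nil pointer always wins, similar in spirit to mergo's `WithoutDereference` option), and `Config`/`LocalOnly*` are illustrative names.

```go
package main

import "fmt"

// Config mirrors the problem described above: with a plain bool, an
// explicit "false" and "not set" are indistinguishable during a merge;
// with *bool, nil means "not set" and &false means "explicitly false".
type Config struct {
	LocalOnlyValue bool
	LocalOnlyPtr   *bool
}

// merge overlays child settings onto parent ones.
func merge(parent, child Config) Config {
	out := parent
	// Value field: we cannot tell child's explicit false from "not set",
	// so an override to false is silently lost.
	if child.LocalOnlyValue {
		out.LocalOnlyValue = child.LocalOnlyValue
	}
	// Pointer field: non-nil always wins, even when it points at false.
	if child.LocalOnlyPtr != nil {
		out.LocalOnlyPtr = child.LocalOnlyPtr
	}
	return out
}

func main() {
	t, f := true, false
	parent := Config{LocalOnlyValue: true, LocalOnlyPtr: &t}
	child := Config{LocalOnlyValue: false, LocalOnlyPtr: &f}
	merged := merge(parent, child)
	fmt.Println(merged.LocalOnlyValue, *merged.LocalOnlyPtr) // true false
}
```

The value field keeps the parent's `true` even though the child explicitly set `false`, while the pointer field correctly takes the child's override.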
…icks#2356) ## Changes Added missing .gitignore files to templates ## Tests There were some incorrect snapshots of gitignore files in acceptance tests, probably generated by testing infra. Updated them to new files --------- Co-authored-by: Lennart Kats (databricks) <[email protected]>
… to 0.58.1 (databricks#2357) Bumps [github.com/databricks/databricks-sdk-go](https://github.com/databricks/databricks-sdk-go) from 0.57.0 to 0.58.1. <details> <summary>Release notes</summary> <p><em>Sourced from <a href="https://github.com/databricks/databricks-sdk-go/releases">github.com/databricks/databricks-sdk-go's releases</a>.</em></p> <blockquote> <h2>v0.58.1</h2> <h3>Internal Changes</h3> <ul> <li>Do not send ForceSendFields as query parameters.</li> </ul> <h2>v0.58.0</h2> <h2>[Release] Release v0.58.0</h2> <h3>New Features and Improvements</h3> <ul> <li>Enable async refreshes for OAuth tokens (<a href="https://redirect.github.com/databricks/databricks-sdk-go/pull/1143">#1143</a>).</li> </ul> <h3>Internal Changes</h3> <ul> <li>Add support for asynchronous data plane token refreshes (<a href="https://redirect.github.com/databricks/databricks-sdk-go/pull/1142">#1142</a>).</li> <li>Introduce new TokenSource interface that takes a <code>context.Context</code> (<a href="https://redirect.github.com/databricks/databricks-sdk-go/pull/1141">#1141</a>).</li> </ul> <h3>API Changes:</h3> <ul> <li>Added <code>GetMessageQueryResultByAttachment</code> method for <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/dashboards#GenieAPI">w.Genie</a> workspace-level service.</li> <li>Added <code>Id</code> field for <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/apps#App">apps.App</a>.</li> <li>Added <code>LimitConfig</code> field for <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/billing#UpdateBudgetPolicyRequest">billing.UpdateBudgetPolicyRequest</a>.</li> <li>Added <code>Volumes</code> field for <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#ClusterLogConf">compute.ClusterLogConf</a>.</li> <li>Removed <code>ReviewState</code>, <code>Reviews</code> and <code>RunnerCollaborators</code> fields for <a 
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/cleanrooms#CleanRoomAssetNotebook">cleanrooms.CleanRoomAssetNotebook</a>.</li> </ul> <p>OpenAPI SHA: 99f644e72261ef5ecf8d74db20f4b7a1e09723cc, Date: 2025-02-11</p> </blockquote> </details> <details> <summary>Changelog</summary> <p><em>Sourced from <a href="https://github.com/databricks/databricks-sdk-go/blob/main/CHANGELOG.md">github.com/databricks/databricks-sdk-go's changelog</a>.</em></p> <blockquote> <h2>[Release] Release v0.58.1</h2> <h3>Internal Changes</h3> <ul> <li>Do not send ForceSendFields as query parameters.</li> </ul> <h2>[Release] Release v0.58.0</h2> <h3>New Features and Improvements</h3> <ul> <li>Enable async refreshes for OAuth tokens (<a href="https://redirect.github.com/databricks/databricks-sdk-go/pull/1143">#1143</a>).</li> </ul> <h3>Internal Changes</h3> <ul> <li>Add support for asynchronous data plane token refreshes (<a href="https://redirect.github.com/databricks/databricks-sdk-go/pull/1142">#1142</a>).</li> <li>Introduce new TokenSource interface that takes a <code>context.Context</code> (<a href="https://redirect.github.com/databricks/databricks-sdk-go/pull/1141">#1141</a>).</li> </ul> <h3>API Changes:</h3> <ul> <li>Added <code>GetMessageQueryResultByAttachment</code> method for <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/dashboards#GenieAPI">w.Genie</a> workspace-level service.</li> <li>Added <code>Id</code> field for <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/apps#App">apps.App</a>.</li> <li>Added <code>LimitConfig</code> field for <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/billing#UpdateBudgetPolicyRequest">billing.UpdateBudgetPolicyRequest</a>.</li> <li>Added <code>Volumes</code> field for <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#ClusterLogConf">compute.ClusterLogConf</a>.</li> <li>Added .</li> <li>Removed 
<code>ReviewState</code>, <code>Reviews</code> and <code>RunnerCollaborators</code> fields for <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/cleanrooms#CleanRoomAssetNotebook">cleanrooms.CleanRoomAssetNotebook</a>.</li> </ul> <p>OpenAPI SHA: 99f644e72261ef5ecf8d74db20f4b7a1e09723cc, Date: 2025-02-11</p> </blockquote> </details> <details> <summary>Commits</summary> <ul> <li><a href="https://github.com/databricks/databricks-sdk-go/commit/967d0632b7676ca14b3dae154dcc2f727f4350c6"><code>967d063</code></a> [Release] Release v0.58.1 (<a href="https://redirect.github.com/databricks/databricks-sdk-go/issues/1146">#1146</a>)</li> <li><a href="https://github.com/databricks/databricks-sdk-go/commit/9dc3c56fb0afb65e8597f205db087fb2c6cca21d"><code>9dc3c56</code></a> [Release] Release v0.58.0 (<a href="https://redirect.github.com/databricks/databricks-sdk-go/issues/1144">#1144</a>)</li> <li><a href="https://github.com/databricks/databricks-sdk-go/commit/8307a4d467368f6a4290cba179a334d1f816ebd6"><code>8307a4d</code></a> [Feature] Enable async refreshes for OAuth tokens (<a href="https://redirect.github.com/databricks/databricks-sdk-go/issues/1143">#1143</a>)</li> <li><a href="https://github.com/databricks/databricks-sdk-go/commit/815cace601ed08e11794d9c20e8c42e6af376f4a"><code>815cace</code></a> [Internal] Add support for asynchronous data plane token refreshes (<a href="https://redirect.github.com/databricks/databricks-sdk-go/issues/1142">#1142</a>)</li> <li><a href="https://github.com/databricks/databricks-sdk-go/commit/3aebd68bf334b94e63974963cd967f836b559a48"><code>3aebd68</code></a> [Internal] Introduce new TokenSource interface that takes a <code>context.Context</code>...</li> <li>See full diff in <a href="https://github.com/databricks/databricks-sdk-go/compare/v0.57.0...v0.58.1">compare view</a></li> </ul> </details> <br /> <details> <summary>Most Recent Ignore Conditions Applied to This Pull Request</summary> | Dependency Name | Ignore 
Conditions | | --- | --- | | github.com/databricks/databricks-sdk-go | [>= 0.28.a, < 0.29] | </details> [](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) --- <details> <summary>Dependabot commands and options</summary> <br /> You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. 
You can achieve the same result by closing it manually - `@dependabot show <dependency name> ignore conditions` will show all of the ignore conditions of the specified dependency - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself) </details> --------- Signed-off-by: dependabot[bot] <[email protected]> Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> Co-authored-by: Andrew Nester <[email protected]>
## Changes 1. Add a ".idea" entry to ".gitignore" ## Tests 1. Manually checked that, with this change, files under the .idea/ folder are not staged for commit
## Changes - Get rid of artifacts.DetectPackages which is a thin wrapper around artifacts/whl.DetectPackage - Get rid of parsing name out of setup.py. Do not randomize either, use a static one. ## Tests Existing tests.
…n flag (databricks#2373) ## Changes The CLI generation template was using RequiredPathField from the incorrect request entity (the body field of the request rather than the request itself). Thus, for some commands, required path parameters were not required when --json was specified. ## Tests Regenerated commands work correctly
…cks#2358) Bumps [github.com/spf13/cobra](https://github.com/spf13/cobra) from 1.8.1 to 1.9.1. <details> <summary>Release notes</summary> <p><em>Sourced from <a href="https://github.com/spf13/cobra/releases">github.com/spf13/cobra's releases</a>.</em></p> <blockquote> <h2>v1.9.1</h2> <h3>🐛 Fixes</h3> <ul> <li>Fix CompletionFunc implementation by <a href="https://github.com/ccoVeille"><code>@ccoVeille</code></a> in <a href="https://redirect.github.com/spf13/cobra/pull/2234">spf13/cobra#2234</a></li> <li>Revert "Make detection for test-binary more universal (<a href="https://redirect.github.com/spf13/cobra/issues/2173">#2173</a>)" by <a href="https://github.com/marckhouzam"><code>@marckhouzam</code></a> in <a href="https://redirect.github.com/spf13/cobra/pull/2235">spf13/cobra#2235</a></li> </ul> <p><strong>Full Changelog</strong>: <a href="https://github.com/spf13/cobra/compare/v1.9.0...v1.9.1">https://github.com/spf13/cobra/compare/v1.9.0...v1.9.1</a></p> <h2>v1.9.0</h2> <h2>✨ Features</h2> <ul> <li>Allow linker to perform deadcode elimination for program using Cobra by <a href="https://github.com/aarzilli"><code>@aarzilli</code></a> in <a href="https://redirect.github.com/spf13/cobra/pull/1956">spf13/cobra#1956</a></li> <li>Add default completion command even if there are no other sub-commands by <a href="https://github.com/marckhouzam"><code>@marckhouzam</code></a> in <a href="https://redirect.github.com/spf13/cobra/pull/1559">spf13/cobra#1559</a></li> <li>Add CompletionWithDesc helper by <a href="https://github.com/ccoVeille"><code>@ccoVeille</code></a> in <a href="https://redirect.github.com/spf13/cobra/pull/2231">spf13/cobra#2231</a></li> </ul> <h2>🐛 Fixes</h2> <ul> <li>Fix deprecation comment for Command.SetOutput by <a href="https://github.com/thaJeztah"><code>@thaJeztah</code></a> in <a href="https://redirect.github.com/spf13/cobra/pull/2172">spf13/cobra#2172</a></li> <li>Replace deprecated ioutil usage by <a 
href="https://github.com/nirs"><code>@nirs</code></a> in <a href="https://redirect.github.com/spf13/cobra/pull/2181">spf13/cobra#2181</a></li> <li>Fix --version help and output for plugins by <a href="https://github.com/nirs"><code>@nirs</code></a> in <a href="https://redirect.github.com/spf13/cobra/pull/2180">spf13/cobra#2180</a></li> <li>Allow to reset the templates to the default by <a href="https://github.com/marckhouzam"><code>@marckhouzam</code></a> in <a href="https://redirect.github.com/spf13/cobra/pull/2229">spf13/cobra#2229</a></li> </ul> <h2>🤖 Completions</h2> <ul> <li>Make Powershell completion work in constrained mode by <a href="https://github.com/lstemplinger"><code>@lstemplinger</code></a> in <a href="https://redirect.github.com/spf13/cobra/pull/2196">spf13/cobra#2196</a></li> <li>Improve detection for flags that accept multiple values by <a href="https://github.com/thaJeztah"><code>@thaJeztah</code></a> in <a href="https://redirect.github.com/spf13/cobra/pull/2210">spf13/cobra#2210</a></li> <li>add CompletionFunc type to help with completions by <a href="https://github.com/ccoVeille"><code>@ccoVeille</code></a> in <a href="https://redirect.github.com/spf13/cobra/pull/2220">spf13/cobra#2220</a></li> <li>Add similar whitespace escape logic to bash v2 completions than in other completions by <a href="https://github.com/kangasta"><code>@kangasta</code></a> in <a href="https://redirect.github.com/spf13/cobra/pull/1743">spf13/cobra#1743</a></li> <li>Print ActiveHelp for bash along other completions by <a href="https://github.com/marckhouzam"><code>@marckhouzam</code></a> in <a href="https://redirect.github.com/spf13/cobra/pull/2076">spf13/cobra#2076</a></li> <li>fix(completions): Complete map flags multiple times by <a href="https://github.com/gabe565"><code>@gabe565</code></a> in <a href="https://redirect.github.com/spf13/cobra/pull/2174">spf13/cobra#2174</a></li> <li>fix(bash): nounset unbound file filter variable on empty extension by <a 
href="https://github.com/scop"><code>@scop</code></a> in <a href="https://redirect.github.com/spf13/cobra/pull/2228">spf13/cobra#2228</a></li> </ul> <h2>🧪 Testing</h2> <ul> <li>Test also with go 1.23 by <a href="https://github.com/nirs"><code>@nirs</code></a> in <a href="https://redirect.github.com/spf13/cobra/pull/2182">spf13/cobra#2182</a></li> <li>Make detection for test-binary more universal by <a href="https://github.com/thaJeztah"><code>@thaJeztah</code></a> in <a href="https://redirect.github.com/spf13/cobra/pull/2173">spf13/cobra#2173</a></li> </ul> <h2>✍🏼 Documentation</h2> <ul> <li>docs: update README.md by <a href="https://github.com/eltociear"><code>@eltociear</code></a> in <a href="https://redirect.github.com/spf13/cobra/pull/2197">spf13/cobra#2197</a></li> <li>Improve site formatting by <a href="https://github.com/nirs"><code>@nirs</code></a> in <a href="https://redirect.github.com/spf13/cobra/pull/2183">spf13/cobra#2183</a></li> <li>doc: add Conduit by <a href="https://github.com/raulb"><code>@raulb</code></a> in <a href="https://redirect.github.com/spf13/cobra/pull/2230">spf13/cobra#2230</a></li> <li>doc: azion project added to the list of CLIs that use cobra by <a href="https://github.com/maxwelbm"><code>@maxwelbm</code></a> in <a href="https://redirect.github.com/spf13/cobra/pull/2198">spf13/cobra#2198</a></li> <li>Fix broken links in active_help.md by <a href="https://github.com/vuil"><code>@vuil</code></a> in <a href="https://redirect.github.com/spf13/cobra/pull/2202">spf13/cobra#2202</a></li> <li>chore: fix function name in comment by <a href="https://github.com/zhuhaicity"><code>@zhuhaicity</code></a> in <a href="https://redirect.github.com/spf13/cobra/pull/2216">spf13/cobra#2216</a></li> </ul> <h2>🔧 Dependency upgrades</h2> <ul> <li>build(deps): bump github.com/cpuguy83/go-md2man/v2 from 2.0.5 to 2.0.6 by <a href="https://github.com/thaJeztah"><code>@thaJeztah</code></a> in <a 
href="https://redirect.github.com/spf13/cobra/pull/2206">spf13/cobra#2206</a></li> <li>Update to latest go-md2man by <a href="https://github.com/mikelolasagasti"><code>@mikelolasagasti</code></a> in <a href="https://redirect.github.com/spf13/cobra/pull/2201">spf13/cobra#2201</a></li> </ul> <!-- raw HTML omitted --> </blockquote> <p>... (truncated)</p> </details> <details> <summary>Commits</summary> <ul> <li><a href="https://github.com/spf13/cobra/commit/40b5bc1437a564fc795d388b23835e84f54cd1d1"><code>40b5bc1</code></a> Revert "Make detection for test-binary more universal (<a href="https://redirect.github.com/spf13/cobra/issues/2173">#2173</a>)" (<a href="https://redirect.github.com/spf13/cobra/issues/2235">#2235</a>)</li> <li><a href="https://github.com/spf13/cobra/commit/a97f9fd47b290016526c8af2dac0531fea5cd773"><code>a97f9fd</code></a> fix CompletionFunc implementation (<a href="https://redirect.github.com/spf13/cobra/issues/2234">#2234</a>)</li> <li><a href="https://github.com/spf13/cobra/commit/5f9c40898e795a9fb0fd5ca83b6e05c3720523d1"><code>5f9c408</code></a> chore: Upgrade dependencies for v1.9.0 (<a href="https://redirect.github.com/spf13/cobra/issues/2233">#2233</a>)</li> <li><a href="https://github.com/spf13/cobra/commit/24ada7fe71e3a3a8741dd52e0a7fc3b97450535a"><code>24ada7f</code></a> Remove the default "completion" cmd if it is alone (<a href="https://redirect.github.com/spf13/cobra/issues/1559">#1559</a>)</li> <li><a href="https://github.com/spf13/cobra/commit/680936a2200be363c61feda8cd29287f0726a48c"><code>680936a</code></a> New logo</li> <li><a href="https://github.com/spf13/cobra/commit/8cb30f9ca53a004a6fe88c5cfcc79ac7b24fc638"><code>8cb30f9</code></a> feat: add CompletionWithDesc helper (<a href="https://redirect.github.com/spf13/cobra/issues/2231">#2231</a>)</li> <li><a href="https://github.com/spf13/cobra/commit/17b6dca2ffaf6113cbd1cf433ec988fa7d63c6f3"><code>17b6dca</code></a> doc: add Conduit (<a 
href="https://redirect.github.com/spf13/cobra/issues/2230">#2230</a>)</li> <li><a href="https://github.com/spf13/cobra/commit/ab5cadcc1bbe224b329726fc5f8b99d6f93e9805"><code>ab5cadc</code></a> Allow to reset the templates to the default (<a href="https://redirect.github.com/spf13/cobra/issues/2229">#2229</a>)</li> <li><a href="https://github.com/spf13/cobra/commit/4ba5566f5704a9c0d205e1ef3efc4896156d33fa"><code>4ba5566</code></a> fix(bash): nounset unbound file filter variable on empty extension (<a href="https://redirect.github.com/spf13/cobra/issues/2228">#2228</a>)</li> <li><a href="https://github.com/spf13/cobra/commit/41b26ec8bb59dfba580f722201bf371c4f5703dd"><code>41b26ec</code></a> Print ActiveHelp for bash along other completions (<a href="https://redirect.github.com/spf13/cobra/issues/2076">#2076</a>)</li> <li>Additional commits viewable in <a href="https://github.com/spf13/cobra/compare/v1.8.1...v1.9.1">compare view</a></li> </ul> </details> <br /> [](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. 
[//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) --- <details> <summary>Dependabot commands and options</summary> <br /> You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot show <dependency name> ignore conditions` will show all of the ignore conditions of the specified dependency - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself) </details> --------- Signed-off-by: dependabot[bot] <[email protected]> Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> Co-authored-by: Andrew Nester <[email protected]>
…tabricks#2374) ## Changes Previously, using python wheel tasks with compute referring to an interactive cluster defined in the same bundle would produce a warning like below ``` GET /api/2.1/clusters/get?cluster_id=${resources.clusters.development_cluster.id} < HTTP/2.0 400 Bad Request < { < "error_code": "INVALID_PARAMETER_VALUE", < "message": "Cluster ${resources.clusters.development_cluster.id} does not exist" < } pid=14465 mutator=seq mutator=initialize mutator=seq mutator=PythonWrapperWarning sdk=true ``` This PR fixes it by making sure that we check the spark version for such clusters based on the bundle configuration and don't make API calls ## Tests Added acceptance test
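The fix described above amounts to recognizing unresolved bundle references before hitting the API. A rough standalone sketch (all names and the reference format handling here are illustrative, not the CLI's real implementation):

```go
package main

import (
	"fmt"
	"strings"
)

// getSparkVersion sketches the fix: if the cluster ID is an unresolved
// bundle reference like "${resources.clusters.<name>.id}", look up the
// spark version in the bundle's own cluster definitions instead of
// calling the clusters API with a placeholder that cannot resolve.
func getSparkVersion(clusterID string, bundleClusters map[string]string) (string, error) {
	if strings.HasPrefix(clusterID, "${resources.clusters.") {
		// Extract "<name>" from "${resources.clusters.<name>.id}".
		name := strings.TrimSuffix(strings.TrimPrefix(clusterID, "${resources.clusters."), ".id}")
		if v, ok := bundleClusters[name]; ok {
			return v, nil
		}
		return "", fmt.Errorf("cluster %s not defined in bundle", name)
	}
	// A real cluster ID: the production code would call the clusters API
	// here (omitted in this sketch).
	return "", fmt.Errorf("API lookup not implemented in this sketch")
}

func main() {
	clusters := map[string]string{"development_cluster": "15.4.x-scala2.12"}
	v, err := getSparkVersion("${resources.clusters.development_cluster.id}", clusters)
	fmt.Println(v, err) // 15.4.x-scala2.12 <nil>
}
```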
) ## Changes Added PyPI and Maven library tests, needed for databricks#2382 since we don't currently have any coverage for PyPI or Maven libraries
## Changes This PR: 1. No longer sets the `DATABRICKS_CLI_PARENT_PID` environment variable since it was never required in the first place and was mistakenly merged in the initial PR. 2. Performs minor cleanup based on post merge feedback in databricks#2354. ## Tests N/A
…atabricks#2372) ## Changes 1. Change the **default-python** bundle template to set `data_security_mode` of a cluster to SINGLE_USER 2. Change the **experimental-jobs-as-code** bundle template to set `data_security_mode` of a cluster to SINGLE_USER ## Why Explicitly adding this field saves experienced users from confusion about which security mode is applied to the cluster ## Tests Changed existing unit and integration tests to pass with this change
…ed (databricks#2335) ## Changes Now when the `profile` flag is used we no longer pick up the host from the bundle and instead use the one provided by the `-p` flag Previous behaviour in the context of a bundle ``` databricks current-user me -p profile_name Error: cannot resolve bundle auth configuration: config host mismatch: profile uses host https://non-existing-subdomain.databricks.com, but CLI configured to use https://foo.com ``` New behaviour (make an api call) ``` databricks current-user me -p profile_name { email: "[email protected]" ... } ``` We still load bundle configuration when the `-t` flag is provided because we want to load host information from the target. Fixes databricks#1358 ## Tests Added acceptance test
…atabricks#2377) ## Changes - Add 'serverless' prompt to default-python template (default is currently set to "no"). - This is a simplified version of databricks#2348 with 'auto' functionality removed. ## Tests - Split default-python into default-python/classic, default-python/serverless, default-python/serverless-customcatalog. - Manually check that "bundle init default-python" with serverless=yes can be deployed and run on dogfood and test env.
## Changes Fix diff.py to apply replacements that have newlines in them. ## Tests Existing tests.
…ks#2386) ## Changes Since at this moment we set default to 'no', interactively it should also default to 'no'. However, it just uses the first option. ## Tests Manually running `cli bundle init default-python`
…2388) ## Tests Manually running 'bundle init default-python' - no question about serverless.
## Changes Instead of LocalOnly with non-composable semantics there are two composable options: - Local - enable test locally - Cloud - enable test on the cloud By default Cloud is switched off except in bundle (but not in bundle/variables and bundle/help). ## Tests Using this in databricks#2383 to have test that runs on cloud but not locally.
## Tests Include full output of default-python/classic so it can be used as a base for diffs in cloud tests databricks#2383
Notable changes: Starting with this version, the CLI does not load bundle auth information when a CLI command is executed inside the bundle directory with a profile explicitly provided via the `-p` flag. For more details see the related GitHub issue databricks#1358 CLI: * Do not load host from bundle for CLI commands when profile flag is used ([databricks#2335](databricks#2335)). * Fixed accessing required path parameters in CLI generation when the --json flag is specified ([databricks#2373](databricks#2373)). Bundles: * Provide instructions for testing in the default-python template ([databricks#2355](databricks#2355)). * Remove `run_as` from the built-in templates ([databricks#2044](databricks#2044)). * Change warning about incomplete permissions section into a recommendation ([databricks#2043](databricks#2043)). * Refine `mode: production` diagnostic output ([databricks#2236](databricks#2236)). * Support serverless mode in default-python template (explicit prompt) ([databricks#2377](databricks#2377)). * Set default data_security_mode to "SINGLE_USER" in bundle templates ([databricks#2372](databricks#2372)). * Fixed spark version check for clusters defined in the same bundle ([databricks#2374](databricks#2374)). API Changes: * Added `databricks genie get-message-query-result-by-attachment` command. OpenAPI commit 99f644e72261ef5ecf8d74db20f4b7a1e09723cc (2025-02-11)
…nctions (databricks#2390) ## Changes - Instead of constructing chains of mutators and then executing them, execute them directly. - Remove functionality related to chain-building: Seq, If, Defer, newPhase, logString. - Phases become functions that apply the changes directly rather than construct mutator chains that will be called later. - Add a helper ApplySeq to call multiple mutators, use it where Apply+Seq were used before. This is intended to be a refactoring without functional changes, but there are a few behaviour changes: - Since defer() is used to call unlock instead of bundle.Defer(), unlocking will now happen even in case of panics. - In --debug, the phase names are still logged once before the start of the phase, but each entry no longer has 'seq' or the phase name in it. - The message "Deployment complete!" was printed even if the terraform.Apply() mutator had an error. It no longer does that. ## Motivation The use of the chains was necessary when mutators were returning a list of other mutators instead of calling them directly. But that has since been removed, so now the chain machinery has no purpose anymore. Use of direct functions simplifies the logic and makes bugs more apparent and easy to fix. Other improvements that this unlocks: - Simpler stacktraces/debugging (breakpoints). - Use of functions with a narrowly scoped API: instead of mutators that receive the full bundle config, we can use focused functions that only deal with the sections they care about, e.g. prepareGitSettings(currentGitSection) -> updatedGitSection. This makes the data flow more apparent. - Parallel computations across mutators (within a phase): launch goroutines fetching data from APIs at the beginning, process them once they are ready. ## Tests Existing tests.
lgtm from the code perspective - we need to add a doc on how to use it.
An authorized user can trigger integration tests manually. Checks will be approved automatically on success.
Changes
This PR changes the labs code base to allow offline installation of labs projects (such as UCX). When the `--offline=true` flag is passed, the CLI skips checking for new project versions and downloading code from GitHub, and instead loads the project from the local installation folder. This is useful on systems with restricted internet access; the expected setup is:

1. Install the labs project on a machine that has internet access.
2. Zip the installation folder and copy it to the target machine.
3. Run `databricks labs install --offline=true`; the CLI will look for the code in the same install directory and, if present, load it from there.
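A hypothetical end-to-end session illustrating the steps above (the install path and archive name are assumptions for illustration, not necessarily the CLI's actual layout):

```
# On a machine with internet access: install the project normally.
databricks labs install ucx

# Package the resulting install directory and move it to the offline machine.
zip -r labs-ucx.zip ~/.databricks/labs/ucx

# On the offline machine: unzip into the same location, then install
# from the local copy without any network access.
unzip labs-ucx.zip -d ~/
databricks labs install ucx --offline=true
```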
Closes #1646
Related to databrickslabs/ucx#3418.
Tests
Added a unit test case and tested manually.
NO_CHANGELOG=true