Added KDTree support for KNN. Added tests/time benchmarks for new knn model. #26
Add KD-Tree Alternative for KNN Alignment
Overview
This PR introduces an alternative implementation of K-Nearest Neighbors (KNN) alignment in the depth completion pipeline that uses KD-trees via `torch_kdtree`, with the goal of improving inference time. Users can now choose between the existing `torch_cluster` implementation and the new KD-tree approach.

Changes
Core Implementation
- `_knn_aligns_kdtree()` method added as an alternative to `_knn_aligns_torch_cluster()`
- `knn_aligns()` now dispatches between the two implementations based on the `kd_tree` parameter (see the sketch under API Changes below)
- `forward()` and `kss_completer()` methods accept the `kd_tree` boolean parameter

This change is fully backward compatible: existing code continues to work unchanged as long as the `kd_tree` parameter is left set to false.

API Changes
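As a rough illustration of the dispatch described above, here is a minimal, self-contained sketch. The class name, argument names, and tensor shapes are assumptions for illustration only; a brute-force `torch.cdist` stands in for the existing `torch_cluster` backend and `scipy.spatial.cKDTree` stands in for `torch_kdtree`, since the real backends live in the project code.

```python
import torch
from scipy.spatial import cKDTree


class KSSCompleterSketch(torch.nn.Module):
    """Illustrative only: names, shapes, and stand-in backends are assumptions."""

    def _knn_aligns_torch_cluster(self, sparse_xy, dense_xy, K):
        # Stand-in for the existing torch_cluster-based path:
        # brute-force pairwise distances followed by top-K selection.
        d = torch.cdist(dense_xy, sparse_xy)             # (N_dense, N_sparse)
        return torch.topk(d, K, largest=False).indices   # (N_dense, K)

    def _knn_aligns_kdtree(self, sparse_xy, dense_xy, K):
        # Stand-in for the new KD-tree path (the PR uses torch_kdtree;
        # a CPU cKDTree is used here so the sketch runs anywhere).
        tree = cKDTree(sparse_xy.detach().cpu().numpy())
        _, idx = tree.query(dense_xy.detach().cpu().numpy(), k=K)
        return torch.as_tensor(idx, device=dense_xy.device)

    def knn_aligns(self, sparse_xy, dense_xy, K=5, kd_tree=False):
        # kd_tree=False keeps the original behaviour, so existing callers
        # are unaffected; kd_tree=True switches to the KD-tree backend.
        if kd_tree:
            return self._knn_aligns_kdtree(sparse_xy, dense_xy, K)
        return self._knn_aligns_torch_cluster(sparse_xy, dense_xy, K)


# Hypothetical usage: both calls return (N_dense, K) neighbor indices.
sparse_xy = torch.rand(1000, 2)
dense_xy = torch.rand(5000, 2)
completer = KSSCompleterSketch()
idx_cluster = completer.knn_aligns(sparse_xy, dense_xy, K=5, kd_tree=False)
idx_kdtree = completer.knn_aligns(sparse_xy, dense_xy, K=5, kd_tree=True)
```

In the actual change, `forward()` and `kss_completer()` accept the same `kd_tree` flag and presumably pass it through to `knn_aligns()`, so the default `kd_tree=False` leaves existing call sites untouched.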
Performance Characteristics
Performance varies based on image resolution and sparsity levels. The KD-tree implementation can provide substantial speedups for high-resolution images:
Example Performance (1920x1280 resolution):
However, for smaller images, `torch_cluster` may be faster.

Validation and Testing
Benchmark Script
A benchmark script, `benchmark_knn_direct.py`, is included to measure execution time and validate correctness.

Usage:
```
python benchmark_knn_direct.py \
    --sparsity 0.60 0.90 \
    --runs 5 \
    --K 5 \
    --height 1280 \
    --width 1920
```

Parameters:
- `--sparsity`: List of sparsity ratios to test (default: `[0.60, 0.90]`)
- `--runs`: Number of benchmark runs per test (default: 5)
- `--K`: Number of nearest neighbors (default: 5)
- `--height`: Image height for testing (default: 1280)
- `--width`: Image width for testing (default: 1920)

Correctness Validation
The benchmark includes distance validation to ensure that both implementations produce mathematically equivalent results. Any differences in the returned neighbors are due to legitimate tie-breaking when multiple points are equidistant, which can lead to different neighbors being selected. Hence, rather than requiring exact index equality, the correctness tests check three conditions (see the benchmark script for the exact checks).
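Separately from those conditions, as an illustration of the distance-based idea (function names and the tolerance below are my assumptions, not the benchmark's actual checks), one way to compare the two backends without penalizing tie-breaking is to compare sorted neighbor distances rather than neighbor indices:

```python
import torch


def neighbor_distances(points, queries, idx):
    """Distances from each query to its selected neighbors.

    `idx` has shape (N_queries, K) and indexes rows of `points`.
    """
    return torch.linalg.norm(points[idx] - queries[:, None, :], dim=-1)


def check_equivalent(points, queries, idx_cluster, idx_kdtree, atol=1e-5):
    # Compare sorted neighbor distances instead of raw indices, so that ties
    # (equidistant points) resolved differently by the two backends do not
    # count as mismatches.
    d_a = neighbor_distances(points, queries, idx_cluster).sort(dim=-1).values
    d_b = neighbor_distances(points, queries, idx_kdtree).sort(dim=-1).values
    return torch.allclose(d_a, d_b, atol=atol)
```

Two index sets that differ only in how equidistant points were ordered will pass such a check, which matches the tie-breaking behavior described above.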
Dependencies
The KD-tree implementation requires the `torch_kdtree` package. It can be installed from source, which is what I did and tested with, and which is why I have not added it to the requirements.
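Because `torch_kdtree` is not listed in the requirements, a guarded import is one way to keep the default `torch_cluster` path working when the package is absent. This is only a suggested pattern on my side, not something this PR adds:

```python
# Hypothetical guard (not part of this PR): only fail when the optional
# KD-tree backend is actually requested.
try:
    import torch_kdtree  # optional dependency, installed from source
    HAS_TORCH_KDTREE = True
except ImportError:
    HAS_TORCH_KDTREE = False


def require_torch_kdtree():
    """Raise a clear error if kd_tree=True is requested without the package."""
    if not HAS_TORCH_KDTREE:
        raise ImportError(
            "kd_tree=True requires torch_kdtree; install it from source "
            "or keep kd_tree=False to use the torch_cluster path."
        )
```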
Use the provided benchmark script to make an informed decision for your use case.
Addresses #4, #22, and maybe #23.
cc @Ammoniro @HeoeH