chore: Add hdfs feature test job #2350
Conversation
Codecov Report ✅ All modified and coverable lines are covered by tests.

@@ Coverage Diff @@
##               main    #2350      +/-   ##
============================================
+ Coverage     56.12%   57.42%   +1.29%
- Complexity      976     1297     +321
============================================
  Files           119      147      +28
  Lines         11743    13415    +1672
  Branches       2251     2347      +96
============================================
+ Hits           6591     7703    +1112
- Misses         4012     4451     +439
- Partials       1140     1261     +121

View full report in Codecov by Sentry.
Force-pushed from eefa69f to 1aa1d47 (compare).
artifact_name: ${{ matrix.os }}-java-${{ matrix.java_version }}-features-${{ matrix.features.value }}-${{ github.run_id }}-${{ github.run_number }}-${{ github.run_attempt }}
features: ${{ matrix.features.value }}
maven_opts: "-Dtest=none -Dfeatures=${{ matrix.features.value }}"
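For context, the matrix values referenced above would come from a job matrix roughly like the following. This is a hypothetical sketch, assuming `os`, `java_version`, and a `features` object with a `value` key; the actual workflow file may define these differently:

```yaml
# Hypothetical sketch of the job matrix feeding the step above;
# names and values are illustrative, not the real workflow.
strategy:
  matrix:
    os: [ubuntu-latest]
    java_version: [17]
    features:
      - value: hdfs
```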
`-Dtest=none` skips the Java tests.
@@ -1138,4 +1138,9 @@ abstract class CometTestBase
    usingDataSourceExec(conf) &&
      !CometConf.COMET_SCAN_ALLOW_INCOMPATIBLE.get(conf)
  }

+  def featureEnabled(feature: String): Boolean = {
+    System.getProperty("feature", "").split(",").contains(feature)
+  }
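For illustration, the helper's comma-list membership check can be mirrored in Rust. This is a sketch only, not project code (the PR adds the Scala helper above); `feature_enabled` and its parameters are hypothetical names:

```rust
// Hypothetical Rust mirror of the Scala featureEnabled helper:
// a feature is enabled if it appears verbatim in the
// comma-separated property value (e.g. "hdfs,iceberg").
fn feature_enabled(features_prop: &str, feature: &str) -> bool {
    features_prop.split(',').any(|f| f == feature)
}
```

Like the Scala version, an empty property string enables nothing (splitting "" yields one empty entry, which matches no real feature name).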
A temporary solution before #2372.
@parthchandra Could you please take a look?
{
  val testFilePath = dir.toString
  writeTestParquetFile(testFilePath)
  withSQLConf(CometConf.COMET_NATIVE_SCAN_IMPL.key -> CometConf.SCAN_NATIVE_DATAFUSION) {
IMO we also should check `native_iceberg_compat`.
Thanks, I will add `native_iceberg_compat` mode.
@@ -74,7 +74,9 @@ class ParquetReadFromFakeHadoopFsSuite extends CometTestBase with AdaptiveSparkP
    .startsWith(FakeHDFSFileSystem.PREFIX))
  }

-  ignore("test native_datafusion scan on fake fs") {
+  test("test native_datafusion scan on fake fs") {
Should we also test the `hdfs` feature and `native_iceberg_compat` mode?
> should we also test the `hdfs` feature

The `hdfs` feature does not seem to support non-hdfs schemes yet, as noted in #2360 (comment).
datafusion-comet/native/fs-hdfs/src/hdfs.rs
Lines 844 to 865 in 72eb0e9
fn get_namenode_uri(path: &str) -> Result<String, HdfsErr> {
    match Url::parse(path) {
        Ok(url) => match url.scheme() {
            LOCAL_FS_SCHEME => Ok("file:///".to_string()),
            HDFS_FS_SCHEME | VIEW_FS_SCHEME => {
                if let Some(host) = url.host() {
                    let mut uri_builder = String::new();
                    write!(&mut uri_builder, "{}://{}", url.scheme(), host).unwrap();
                    if let Some(port) = url.port() {
                        write!(&mut uri_builder, ":{port}").unwrap();
                    }
                    Ok(uri_builder)
                } else {
                    Err(HdfsErr::InvalidUrl(path.to_string()))
                }
            }
            _ => Err(HdfsErr::InvalidUrl(path.to_string())),
        },
        Err(_) => Err(HdfsErr::InvalidUrl(path.to_string())),
    }
}
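To illustrate why non-hdfs schemes fail here: any scheme other than `file`, `hdfs`, or `viewfs` falls through to the `InvalidUrl` arm. A std-only sketch of that dispatch (hypothetical code; the real function above uses the `url` crate and `HdfsErr`):

```rust
// Hypothetical std-only sketch of get_namenode_uri's scheme dispatch,
// returning the namenode URI ("scheme://host[:port]") or an error string.
fn namenode_uri(path: &str) -> Result<String, String> {
    let (scheme, rest) = path
        .split_once("://")
        .ok_or_else(|| format!("invalid url: {path}"))?;
    match scheme {
        "file" => Ok("file:///".to_string()),
        "hdfs" | "viewfs" => {
            // The authority (host[:port]) is everything before the first '/'.
            let authority = rest.split('/').next().unwrap_or("");
            if authority.is_empty() {
                Err(format!("invalid url: {path}"))
            } else {
                Ok(format!("{scheme}://{authority}"))
            }
        }
        // s3a://, abfs://, etc. are rejected, which is why the hdfs
        // feature cannot serve non-hdfs schemes yet.
        other => Err(format!("unsupported scheme: {other}")),
    }
}
```

So `hdfs://nn:9000/data` resolves to the namenode `hdfs://nn:9000`, while `s3a://bucket/key` is rejected outright.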
That's correct, yes.
That's one of the fixes we wanted in fs-hdfs datafusion-contrib/fs-hdfs#29
Also, I think at some point it might be more reasonable to use openDAL instead of fs-hdfs (#2367)
> Also, I think at some point it might be more reasonable to use openDAL instead of fs-hdfs (#2367)

Oops, yes. I did mean #2372. Let's get that merged and update the test in this PR.
Should we also be adding the test suites to pr_build_macos.yml?
Thanks, I will add it.
Which issue does this PR close?
Closes #.
Rationale for this change
What changes are included in this PR?
How are these changes tested?