Add Spark 4 protobuf definitions and basic working #11

Open · hntd187 wants to merge 6 commits into apache:master
Conversation
Member
@xuanyuanking Is this PR still WIP, as the description says? I've marked it as a draft for now.
Contributor (Author)
I'd just like to merge this so I can get moving on some follow-up PRs, which is why I asked to have it reviewed.
LuciferYang reviewed on Feb 13, 2026
fn from(value: i16) -> Self {
    spark::expression::Literal {
        #[cfg(feature = "spark-41")]
        data_type: Some(spark::DataType::from(5)),
Using hard-coded numbers to represent Spark data types results in poor readability and maintainability.
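A minimal, self-contained sketch of the pattern being asked for, keeping each type tag named in exactly one place. `SparkTypeId` and its discriminants are illustrative stand-ins inferred from this diff, not the crate's actual generated protobuf API:

```rust
/// Illustrative only: named discriminants instead of bare magic numbers.
/// The enum name and the 5/14 mappings are assumptions taken from the
/// hunks in this review, not the real generated `spark` types.
#[derive(Clone, Copy, Debug)]
#[repr(i32)]
enum SparkTypeId {
    Short = 5, // the i16 literal in the hunk above
    Date = 14, // the date literal in the hunk below
}

fn data_type_tag(id: SparkTypeId) -> i32 {
    // Call sites now read `SparkTypeId::Short` rather than a bare `5`,
    // and the number-to-type mapping lives in one place.
    id as i32
}

fn main() {
    assert_eq!(data_type_tag(SparkTypeId::Short), 5);
    assert_eq!(data_type_tag(SparkTypeId::Date), 14);
}
```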
value.signed_duration_since(chrono::NaiveDate::from_ymd_opt(1970, 1, 1).unwrap());

spark::expression::Literal {
    data_type: Some(spark::DataType::from(14)),
Isn't `#[cfg(feature = "spark-41")]` needed here?
    Ok(())
}

#[tokio::test]
Will this test case be rewritten using the Arrow C FFI interface in the future?
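For reference, a minimal sketch of what an Arrow C Data Interface round trip looks like with the `arrow` crate's `ffi` module. This illustrates the interface being mentioned, not the PR's actual test:

```rust
use arrow::array::{Array, Int32Array};
use arrow::ffi::{from_ffi, to_ffi};

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let array = Int32Array::from(vec![1, 2, 3]);

    // Export the array into the two C structs defined by the
    // Arrow C Data Interface.
    let (ffi_array, ffi_schema) = to_ffi(&array.to_data())?;

    // Import it back, as a consumer on the other side of the FFI
    // boundary would. `from_ffi` is unsafe because it trusts the
    // raw pointers inside the C structs.
    let data = unsafe { from_ffi(ffi_array, &ffi_schema)? };
    let roundtripped = Int32Array::from(data);

    assert_eq!(roundtripped.values().to_vec(), vec![1, 2, 3]);
    Ok(())
}
```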
common: Some(RelationCommon {
    source_info: "".to_string(),
    plan_id: Some(plan_id),
    #[cfg(spark41)]
#[cfg(feature = "spark-41")]
let relation = Relation {
common: Some(RelationCommon {
source_info: "".to_string(),
plan_id: Some(plan_id),
origin: None,
}),
rel_type: Some(rel_type),
};
#[cfg(not(feature = "spark-41"))]
let relation = Relation {
common: Some(RelationCommon {
source_info: "".to_string(),
plan_id: Some(plan_id),
}),
rel_type: Some(rel_type),
};
Maybe it could be changed like this? Would that let us avoid defining the custom `spark41` cfg?
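Some background on the distinction: a bare `#[cfg(spark41)]` only takes effect if a custom `--cfg spark41` flag is injected into rustc (usually from a build script), and newer toolchains warn about such cfgs via `unexpected_cfgs` unless they are declared; `#[cfg(feature = "spark-41")]` instead keys off a Cargo feature declared in Cargo.toml, which Cargo manages and declares for you. A stripped-down sketch of the gated-field pattern from the suggestion above, where `Common` and `make_common` are stand-ins rather than the real protobuf types:

```rust
// Sketch only: enable by declaring `spark-41 = []` under [features]
// in Cargo.toml and building with `--features spark-41`.
struct Common {
    source_info: String,
    plan_id: Option<i64>,
    // This field only exists when the feature is enabled.
    #[cfg(feature = "spark-41")]
    origin: Option<String>,
}

fn make_common(plan_id: i64) -> Common {
    #[cfg(feature = "spark-41")]
    return Common {
        source_info: "".to_string(),
        plan_id: Some(plan_id),
        origin: None,
    };
    #[cfg(not(feature = "spark-41"))]
    Common {
        source_info: "".to_string(),
        plan_id: Some(plan_id),
    }
}

fn main() {
    let c = make_common(1);
    assert_eq!(c.plan_id, Some(1));
}
```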
What changes were proposed in this pull request?
Adds the changed (and new) protobuf definitions for Spark 4, which will let us implement the new Spark 4 capabilities. This is still a WIP.
Why are the changes needed?
Spark updates things, so do we.
Does this PR introduce any user-facing change?
Yes, some interfaces will change, but for the most part user code that worked previously should still work.
How was this patch tested?
By running the tests, though running them on Windows has always been difficult, so it's hard to tell how well this actually works there.
Was this patch authored or co-authored using generative AI tooling?
No.