[FLINK-14526][hive] Support Hive version 1.1.0 and 1.1.1 #9995
Conversation
@flinkbot: Automated checks last run on commit 2f28e38 (Wed Dec 04 14:51:54 UTC 2019).
LGTM.
Btw, can you confirm all the test profiles have passed with this change?
String className;
switch (primitiveTypeInfo.getPrimitiveCategory()) {
    case BOOLEAN:
        className = "org.apache.hadoop.hive.serde2.objectinspector.primitive.JavaConstantBooleanObjectInspector";
Curious: why does 1.1.0 check WritableConstant while 1.1.1 checks JavaConstant?
Because Java constant object inspectors are not available until 1.2.0.
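Since some inspector classes only exist in certain Hive versions, shims typically cannot link against them at compile time. A minimal sketch of the reflection pattern this implies, using java.util.ArrayList as a stand-in for a version-specific Hive object inspector class:

```java
// Hypothetical sketch, not Flink's actual shim code: look a class up by
// name at runtime instead of referencing it at compile time, so the code
// compiles even when the class is absent from the Hive version on the
// classpath.
public class ShimReflection {
    static Object instantiateByName(String className) {
        try {
            // Resolve and instantiate via the no-arg constructor.
            return Class.forName(className).getDeclaredConstructor().newInstance();
        } catch (ReflectiveOperationException e) {
            throw new IllegalStateException(
                "Class not available in this Hive version: " + className, e);
        }
    }

    public static void main(String[] args) {
        // Stand-in for e.g. "...primitive.JavaConstantBooleanObjectInspector".
        Object o = instantiateByName("java.util.ArrayList");
        System.out.println(o.getClass().getName());
    }
}
```

If the class is missing from the runtime classpath, the lookup fails with a clear error naming the class, which is why shims for older versions fall back to class names that do exist there.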
The current inheritance pattern is that a later-version Hive shim inherits from the previous version's shim unless there is a compelling reason not to; this reduces code duplication.
However, HiveShimV120 doesn't inherit from HiveShimV111.
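The inheritance pattern described above can be sketched as follows. Class and method names here are illustrative stand-ins, not Flink's actual shim hierarchy:

```java
// Illustrative sketch of the shim inheritance pattern: each later-version
// shim extends the previous one and overrides only what changed in that
// Hive release. These are local toy classes, not Flink's real shims.
interface Shim {
    String version();
}

class ShimV110 implements Shim {
    public String version() { return "1.1.0"; }
}

// 1.1.1 changed little, so its shim overrides almost nothing.
class ShimV111 extends ShimV110 {
    public String version() { return "1.1.1"; }
}

// 1.2.0 would override more methods, e.g. because it added
// Java constant object inspectors.
class ShimV120 extends ShimV111 {
    public String version() { return "1.2.0"; }
}

public class ShimHierarchy {
    public static void main(String[] args) {
        System.out.println(new ShimV111().version());
    }
}
```

The benefit is that a new shim only carries the delta against its predecessor; the cost, as noted in this thread, is surprises when a link in the chain (here V120 not extending V111) breaks the expectation.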
It's good if @bowenli86 can also review.
import java.util.List;
import java.util.Map;

/**
 * A shim layer to support different versions of Hive.
 */
-public interface HiveShim {
+public interface HiveShim extends Serializable {
It's probably better to leave this change (about serialization) to a dedicated PR with relevant tests. It makes review easier and limits the scope of this change.
Test will fail if HiveShim is not serializable. Do you mean we should first open and merge a dedicated PR for this and come back later?
Does the test fail before this PR, given that HiveShim was not serializable before? If we need the fix to pass the tests, we can put it in the same PR to reduce the overhead.
Tests pass before this PR only because we don't have enough test coverage to expose the issue. Before this PR, HiveShim is only used to get object inspectors for DATE and TIMESTAMP columns, while with the PR, HiveShim is needed for all columns. So we need the fix to pass the tests.
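A test that exposes this kind of issue is a plain Java serialization round trip. The sketch below uses a local DummyShim stand-in rather than Flink's HiveShim, since the real interface isn't reproduced here:

```java
import java.io.*;

// Hedged sketch: verify that a shim-like object survives Java
// serialization. DummyShim is a local stand-in for an object that, like
// HiveShim implementations, gets shipped inside serialized functions.
public class SerializationCheck {
    static class DummyShim implements Serializable {
        final String version = "1.1.1";
    }

    static byte[] serialize(Object o) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(o); // throws NotSerializableException if any field isn't serializable
        }
        return bos.toByteArray();
    }

    static Object deserialize(byte[] bytes) throws IOException, ClassNotFoundException {
        try (ObjectInputStream ois = new ObjectInputStream(new ByteArrayInputStream(bytes))) {
            return ois.readObject();
        }
    }

    public static void main(String[] args) throws Exception {
        DummyShim copy = (DummyShim) deserialize(serialize(new DummyShim()));
        System.out.println(copy.version);
    }
}
```

If any field of the object (or of an implementation class) is not serializable, the round trip fails at `writeObject`, which is exactly the failure mode a non-serializable HiveShim would hit once it is carried by HiveGenericUDF.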
Force-pushed from 5f0728f to fe532e1.
PR updated to address comments. Also verified tests pass for all our profiles.
+1 looks good to me on my side.
Force-pushed from 959bd5e to 2f28e38.
Rebased. And noted INTERVAL
LGTM, thanks for your contribution! Merging
What is the purpose of the change

To support Hive 1.1.0 and 1.1.1

Brief change log

- Made HiveShim serializable because it's needed in HiveGenericUDF and HiveGenericUDTF. This change should have been made before this PR; we were good only because there's no test to expose this issue.
- Use Writables for Hive UDFs if they're initialized with some constant arguments. Added a conversion from Hive Java primitive to Hive Writable primitive for this.
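The Java-primitive-to-Writable conversion mentioned in the change log can be sketched as below. BooleanWritable here is a local stand-in for org.apache.hadoop.io.BooleanWritable, and the method is illustrative, not Flink's actual conversion code:

```java
// Hypothetical sketch of converting a Java primitive wrapper into a
// Hadoop-style Writable. The nested BooleanWritable is a local stand-in
// for org.apache.hadoop.io.BooleanWritable so the example is
// self-contained.
public class JavaToWritable {
    static final class BooleanWritable {
        private final boolean value;
        BooleanWritable(boolean value) { this.value = value; }
        boolean get() { return value; }
    }

    static Object toWritable(Object javaPrimitive) {
        if (javaPrimitive instanceof Boolean) {
            return new BooleanWritable((Boolean) javaPrimitive);
        }
        // ... other primitive categories (BYTE, SHORT, INT, LONG, ...) would
        // each map to their Writable counterpart here.
        throw new UnsupportedOperationException(
            "Unsupported type: " + javaPrimitive.getClass().getName());
    }

    public static void main(String[] args) {
        BooleanWritable w = (BooleanWritable) toWritable(Boolean.TRUE);
        System.out.println(w.get());
    }
}
```

This matters for constant arguments because, on older Hive versions, only the WritableConstant object inspectors exist, so constant values passed to UDFs must be wrapped as Writables rather than left as Java primitives.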
Verifying this change

Covered by existing test cases.
Manually verified by running mvn verify -Phive-1.1.0 and mvn verify -Phive-1.1.1.

Does this pull request potentially affect one of the following parts:

- @Public(Evolving): no

Documentation