[FLINK-12602][travis] Correct the flink pom artifactId config and s… #8563
Conversation
@flinkbot automated checks — last check on commit 3f62ecf (Tue Aug 06 15:40:56 UTC 2019).
@@ -41,6 +41,14 @@ under the License.
    <artifactId>flink-connector-kafka-0.10_${scala.binary.version}</artifactId>
    <version>${project.version}</version>
  </dependency>
  <!-- Due to the scope of `provided` transitive dependencies problem,
this looks hacky; why can't we search the pom for dependencies with a scala-suffix instead? Shouldn't this be as simple as adding `*:*_2.1*:*` as an artifact pattern to `-Dincludes`?
I think this PR is a more conservative approach. Do you mean that if a Flink module depends on another Flink module with a scala suffix, it is also considered dependent on Scala? I considered this rule, but I am not sure it is correct. For example, a module might depend on a scala-suffixed module with `provided` scope and only use the interfaces provided by the scala-suffixed module itself, regardless of the actual Scala version at runtime. Could such a module exist?
> Do you mean that if a flink module depends on another flink module with a scala suffix, it is also considered dependent on scala?

Yes. It doesn't matter whether it is provided or not. The only exception is dependencies with a `test` scope.
Thanks @WeiZhong94 @zentol. @zentol's suggestion also makes sense to me, just as we discussed offline, @WeiZhong94. As I mentioned above, we can take approach 1 for this PR.
I'll update the PR soon.
Force-pushed 274a56b to 0e36580.
Thanks for the review @zentol @WeiZhong94. I have rebased the code and updated the PR!
huh. I just found this dependency in
This means this module does have a scala dependency, and should have a suffix. I'm wondering why the script didn't pick that up.
…cala-free check logic. Brief change log:
- remove the scala version suffix for connector-hive and queryable-state-client-java
- add the scala dependencies for table-api-scala and flink-sql-connectors
- correct the scala-free check logic in `verify_scala_suffixes.sh`
Force-pushed 0e36580 to d97fb74.
Rebased the code (the Python CI failed).
@zentol I have added the Scala-suffix check for all groupIds. I would appreciate it if you have time to take a look again :)!
Merging...
(apache#8563) Brief change log:
- remove the scala version suffix for connector-hive and queryable-state-client-java
- add the scala dependencies for table-api-scala and flink-sql-connectors
- correct the scala-free check logic in `verify_scala_suffixes.sh`
What is the purpose of the change

I found a shell issue in `verify_scala_suffixes.sh` (line 145) as follows:

`grep "${module}_\d\+\.\d\+</artifactId>" "{}"`

This code wants to find all modules whose `artifactId` has a `scala_binary_version` suffix. But the problem is that all of our `artifactId` values follow the pattern `XXX_${scala.binary.version}`, such as:

`<artifactId>flink-tests_${scala.binary.version}</artifactId>`

so the grep result is always empty and this check never took effect.

So, we need to correct the `artifactId` of some of the modules and correct the scala-free check logic.

Brief change log

- remove the scala version suffix for connector-hive and queryable-state-client-java
- add the scala dependencies for table-api-scala and flink-sql-connectors
- correct the scala-free check logic in `verify_scala_suffixes.sh`
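To illustrate the bug, here is a small shell sketch (the pom line below is made up for the demo, modeled on the pattern described above). A literal-version pattern can never match the property-based suffix, while matching the placeholder itself does:

```shell
# A pom line as it actually appears in Flink modules
# (a Maven property, not a literal Scala version):
pom_line='<artifactId>flink-tests_${scala.binary.version}</artifactId>'

# The original check looks for a literal suffix like "_2.11</artifactId>".
# The placeholder contains no digits, so this never matches:
echo "$pom_line" | grep -q '_[0-9]\+\.[0-9]\+</artifactId>' \
  && echo "suffixed" || echo "no match"

# Matching the ${scala.binary.version} placeholder itself works
# ($ and { are literal here, since $ only anchors at the pattern end in BRE):
echo "$pom_line" | grep -q '_${scala.binary.version}</artifactId>' \
  && echo "suffixed" || echo "no match"
```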
NOTE:
We have two ways of handling the connectors:
1. Extend the check logic so that a dependency on a scala-suffixed Flink module also counts as a Scala dependency.
2. Add a `flink-streaming-java` dependency with `provided` scope for the corresponding connectors.

For approach 1, we should add check logic: `dependency:tree` should add an option `-Dincludes=org.apache.flink:*_2.1*::*`, matching artifacts such as `org.apache.flink:flink-streaming-java_2.11`. The `grep` also needs the logic `-E "org.scala-lang|org.apache.flink:[^:]+_2\.1[0-9]"`, and, to exclude the test scope, `grep --invert-match "org.apache.flink:[^:]*_2\.1[0-9]:.*:.*:test"`.
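Approach 1's filtering can be sketched on a fabricated `dependency:tree` excerpt (the module names and versions below are invented for the demo; the real input would come from `mvn dependency:tree`):

```shell
# Fabricated dependency:tree output lines for the demo:
tree='[INFO] +- org.scala-lang:scala-library:jar:2.11.12:compile
[INFO] +- org.apache.flink:flink-streaming-java_2.11:jar:1.9.0:provided
[INFO] +- org.apache.flink:flink-test-utils_2.11:jar:1.9.0:test'

# Keep direct Scala deps and scala-suffixed Flink deps,
# then drop scala-suffixed Flink deps that are only in test scope:
echo "$tree" \
  | grep -E 'org\.scala-lang|org\.apache\.flink:[^:]+_2\.1[0-9]' \
  | grep --invert-match 'org\.apache\.flink:[^:]*_2\.1[0-9]:.*:.*:test'
```

On this sample, the pipeline keeps the `scala-library` and `provided`-scoped `flink-streaming-java_2.11` lines and filters out the test-scoped `flink-test-utils_2.11` line, which is exactly the filtering the proposed patterns describe.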
For approach 2, we should add the dependency of `flink-streaming-java` for `flink-sql-connector-elasticsearch6`, `flink-sql-connector-kafka`, `flink-sql-connector-kafka-0.10`, `flink-sql-connector-kafka-0.11`, and `flink-sql-connector-kafka-0.9`.

For now, I think the check-logic change in approach 1 is a bit complex (a lot of filtering logic). So I prefer approach 2: even when we add a new module in the future, we should already know whether its `artifactId` needs the scala suffix, and can then manually add the Scala dependencies in the pom.

So, in this PR I add the dependency of `flink-streaming-java` for the `flink-sql-connectors`.
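A minimal sketch of the approach-2 dependency that would go into each `flink-sql-connector-*` pom, following the version-property conventions visible in the diff above (treat this as illustrative, not the exact patch):

```xml
<!-- Illustrative: makes the scala-free check see a scala-suffixed
     dependency, while `provided` scope keeps flink-streaming-java
     out of the packaged jar. -->
<dependency>
  <groupId>org.apache.flink</groupId>
  <artifactId>flink-streaming-java_${scala.binary.version}</artifactId>
  <version>${project.version}</version>
  <scope>provided</scope>
</dependency>
```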
Verifying this change

This change is already covered by existing tests.
Does this pull request potentially affect one of the following parts:
- `@Public(Evolving)`: (no)

Documentation
- Updated `queryable_state.md` / `queryable_state.zh.md`.