[FLINK-3241] Fix SNAPSHOT deployment for Scala 2.11
rmetzger committed Jan 19, 2016
1 parent 544abb9 commit 8f0c47d
Showing 11 changed files with 33 additions and 33 deletions.
20 changes: 10 additions & 10 deletions docs/apis/batch/examples.md
@@ -29,8 +29,8 @@ The following example programs showcase different applications of Flink
from simple word counting to graph algorithms. The code samples illustrate the
use of [Flink's API](index.html).

-The full source code of the following and more examples can be found in the __flink-java-examples__
-or __flink-scala-examples__ module of the Flink source repository.
+The full source code of the following and more examples can be found in the __flink-examples-batch__
+or __flink-examples-streaming__ module of the Flink source repository.

* This will be replaced by the TOC
{:toc}
@@ -99,7 +99,7 @@ public static class Tokenizer implements FlatMapFunction<String, Tuple2<String,
}
~~~

-The {% gh_link /flink-examples/flink-java-examples/src/main/java/org/apache/flink/examples/java/wordcount/WordCount.java "WordCount example" %} implements the above described algorithm with input parameters: `<text input path>, <output path>`. As test data, any text file will do.
+The {% gh_link /flink-examples/flink-examples-batch/src/main/java/org/apache/flink/examples/java/wordcount/WordCount.java "WordCount example" %} implements the above described algorithm with input parameters: `<text input path>, <output path>`. As test data, any text file will do.
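For orientation (not part of this commit), here is a minimal, self-contained sketch of how a full Java WordCount job around such a `Tokenizer` is typically wired together; class and variable names are illustrative assumptions, not the exact code of the linked example:

~~~java
import org.apache.flink.api.common.functions.FlatMapFunction;
import org.apache.flink.api.java.DataSet;
import org.apache.flink.api.java.ExecutionEnvironment;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.util.Collector;

public class WordCountSketch {

    public static void main(String[] args) throws Exception {
        final String textPath = args[0];   // <text input path>
        final String outputPath = args[1]; // <output path>

        ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();

        DataSet<String> text = env.readTextFile(textPath);

        DataSet<Tuple2<String, Integer>> counts = text
            .flatMap(new Tokenizer()) // emit (word, 1) pairs
            .groupBy(0)               // group by the word
            .sum(1);                  // sum the counts per word

        counts.writeAsCsv(outputPath, "\n", " ");
        env.execute("WordCount Example");
    }

    // Same idea as the Tokenizer shown in the diff above.
    public static final class Tokenizer implements FlatMapFunction<String, Tuple2<String, Integer>> {
        @Override
        public void flatMap(String value, Collector<Tuple2<String, Integer>> out) {
            for (String token : value.toLowerCase().split("\\W+")) {
                if (!token.isEmpty()) {
                    out.collect(new Tuple2<String, Integer>(token, 1));
                }
            }
        }
    }
}
~~~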

</div>
<div data-lang="scala" markdown="1">
@@ -118,7 +118,7 @@ val counts = text.flatMap { _.toLowerCase.split("\\W+") filter { _.nonEmpty } }
counts.writeAsCsv(outputPath, "\n", " ")
~~~

-The {% gh_link /flink-examples/flink-scala-examples/src/main/scala/org/apache/flink/examples/scala/wordcount/WordCount.scala "WordCount example" %} implements the above described algorithm with input parameters: `<text input path>, <output path>`. As test data, any text file will do.
+The {% gh_link /flink-examples/flink-examples-batch/src/main/scala/org/apache/flink/examples/scala/wordcount/WordCount.scala "WordCount example" %} implements the above described algorithm with input parameters: `<text input path>, <output path>`. As test data, any text file will do.


</div>
@@ -206,7 +206,7 @@ public static final class EpsilonFilter
}
~~~

-The {% gh_link /flink-examples/flink-java-examples/src/main/java/org/apache/flink/examples/java/graph/PageRankBasic.java "PageRank program" %} implements the above example.
+The {% gh_link /flink-examples/flink-examples-batch/src/main/java/org/apache/flink/examples/java/graph/PageRank.java "PageRank program" %} implements the above example.
It requires the following parameters to run: `<pages input path>, <links input path>, <output path>, <num pages>, <num iterations>`.

</div>
@@ -273,7 +273,7 @@ val result = finalRanks
result.writeAsCsv(outputPath, "\n", " ")
~~~

-The {% gh_link /flink-examples/flink-scala-examples/src/main/scala/org/apache/flink/examples/scala/graph/PageRankBasic.scala "PageRank program" %} implements the above example.
+The {% gh_link /flink-examples/flink-examples-batch/src/main/scala/org/apache/flink/examples/scala/graph/PageRankBasic.scala "PageRank program" %} implements the above example.
It requires the following parameters to run: `<pages input path>, <links input path>, <output path>, <num pages>, <num iterations>`.
</div>
</div>
@@ -369,7 +369,7 @@ public static final class ComponentIdFilter
}
~~~

-The {% gh_link /flink-examples/flink-java-examples/src/main/java/org/apache/flink/examples/java/graph/ConnectedComponents.java "ConnectedComponents program" %} implements the above example. It requires the following parameters to run: `<vertex input path>, <edge input path>, <output path> <max num iterations>`.
+The {% gh_link /flink-examples/flink-examples-batch/src/main/java/org/apache/flink/examples/java/graph/ConnectedComponents.java "ConnectedComponents program" %} implements the above example. It requires the following parameters to run: `<vertex input path>, <edge input path>, <output path> <max num iterations>`.

</div>
<div data-lang="scala" markdown="1">
@@ -412,7 +412,7 @@ verticesWithComponents.writeAsCsv(outputPath, "\n", " ")

~~~

-The {% gh_link /flink-examples/flink-scala-examples/src/main/scala/org/apache/flink/examples/scala/graph/ConnectedComponents.scala "ConnectedComponents program" %} implements the above example. It requires the following parameters to run: `<vertex input path>, <edge input path>, <output path> <max num iterations>`.
+The {% gh_link /flink-examples/flink-examples-batch/src/main/scala/org/apache/flink/examples/scala/graph/ConnectedComponents.scala "ConnectedComponents program" %} implements the above example. It requires the following parameters to run: `<vertex input path>, <edge input path>, <output path> <max num iterations>`.
</div>
</div>

@@ -488,13 +488,13 @@ DataSet<Tuple3<Integer, Integer, Double>> priceSums =
priceSums.writeAsCsv(outputPath);
~~~

-The {% gh_link /flink-examples/flink-java-examples/src/main/java/org/apache/flink/examples/java/relational/TPCHQuery10.java "Relational Query program" %} implements the above query. It requires the following parameters to run: `<orders input path>, <lineitem input path>, <output path>`.
+The {% gh_link /flink-examples/flink-examples-batch/src/main/java/org/apache/flink/examples/java/relational/TPCHQuery10.java "Relational Query program" %} implements the above query. It requires the following parameters to run: `<orders input path>, <lineitem input path>, <output path>`.

</div>
<div data-lang="scala" markdown="1">
Coming soon...

-The {% gh_link /flink-examples/flink-scala-examples/src/main/scala/org/apache/flink/examples/scala/relational/TPCHQuery3.scala "Relational Query program" %} implements the above query. It requires the following parameters to run: `<orders input path>, <lineitem input path>, <output path>`.
+The {% gh_link /flink-examples/flink-examples-batch/src/main/scala/org/apache/flink/examples/scala/relational/TPCHQuery3.scala "Relational Query program" %} implements the above query. It requires the following parameters to run: `<orders input path>, <lineitem input path>, <output path>`.

</div>
</div>
8 changes: 4 additions & 4 deletions docs/apis/batch/index.md
@@ -2495,7 +2495,7 @@ env.execute("Iterative Pi Example");
{% endhighlight %}

You can also check out the
-{% gh_link /flink-examples/flink-java-examples/src/main/java/org/apache/flink/examples/java/clustering/KMeans.java "K-Means example" %},
+{% gh_link /flink-examples/flink-examples-batch/src/main/java/org/apache/flink/examples/java/clustering/KMeans.java "K-Means example" %},
which uses a BulkIteration to cluster a set of unlabeled points.
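For context (not part of this commit), here is a condensed sketch of the bulk-iteration pattern (`iterate()`/`closeWith()`) that the referenced K-Means example builds on. The pi-estimation job below is illustrative, with assumed names, and is not the K-Means code itself:

~~~java
import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.api.java.DataSet;
import org.apache.flink.api.java.ExecutionEnvironment;
import org.apache.flink.api.java.operators.IterativeDataSet;

public class BulkIterationSketch {

    public static void main(String[] args) throws Exception {
        ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();
        final int numSamples = 10000;

        // Open a bulk iteration around an initial data set; run numSamples supersteps.
        IterativeDataSet<Integer> initial = env.fromElements(0).iterate(numSamples);

        // Step function: sample one random point per superstep and count hits in the unit circle.
        DataSet<Integer> iteration = initial.map(new MapFunction<Integer, Integer>() {
            @Override
            public Integer map(Integer hits) {
                double x = Math.random();
                double y = Math.random();
                return hits + ((x * x + y * y < 1) ? 1 : 0);
            }
        });

        // Close the iteration, then turn the hit count into an estimate of pi.
        DataSet<Double> pi = initial.closeWith(iteration)
            .map(new MapFunction<Integer, Double>() {
                @Override
                public Double map(Integer count) {
                    return count / (double) numSamples * 4;
                }
            });

        pi.print(); // print() acts as the data sink and triggers execution
    }
}
~~~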

#### Delta Iterations
@@ -2591,7 +2591,7 @@ env.execute("Iterative Pi Example");
{% endhighlight %}

You can also check out the
-{% gh_link /flink-examples/flink-scala-examples/src/main/scala/org/apache/flink/examples/scala/clustering/KMeans.scala "K-Means example" %},
+{% gh_link /flink-examples/flink-examples-batch/src/main/scala/org/apache/flink/examples/scala/clustering/KMeans.scala "K-Means example" %},
which uses a BulkIteration to cluster a set of unlabeled points.

#### Delta Iterations
@@ -2879,7 +2879,7 @@ data.map(new RichMapFunction<String, String>() {

Make sure that the names (`broadcastSetName` in the previous example) match when registering and
accessing broadcasted data sets. For a complete example program, have a look at
-{% gh_link /flink-examples/flink-java-examples/src/main/java/org/apache/flink/examples/java/clustering/KMeans.java#L96 "K-Means Algorithm" %}.
+{% gh_link /flink-examples/flink-examples-batch/src/main/java/org/apache/flink/examples/java/clustering/KMeans.java#L96 "K-Means Algorithm" %}.
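To illustrate the name-matching rule described above (not part of this commit), here is a minimal sketch of registering a broadcast set with `withBroadcastSet()` and reading it via `getBroadcastVariable()`; the class name and data sets are assumed for the example:

~~~java
import java.util.Collection;

import org.apache.flink.api.common.functions.RichMapFunction;
import org.apache.flink.api.java.DataSet;
import org.apache.flink.api.java.ExecutionEnvironment;
import org.apache.flink.configuration.Configuration;

public class BroadcastSetSketch {

    public static void main(String[] args) throws Exception {
        ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();

        // 1. The data set that will be broadcast to all parallel instances of the mapper.
        DataSet<Integer> toBroadcast = env.fromElements(1, 2, 3);

        DataSet<String> data = env.fromElements("a", "b", "c");

        data.map(new RichMapFunction<String, String>() {
            private Collection<Integer> broadcastSet;

            @Override
            public void open(Configuration parameters) throws Exception {
                // 3. Access it; the name must match the one used in withBroadcastSet().
                this.broadcastSet = getRuntimeContext().getBroadcastVariable("broadcastSetName");
            }

            @Override
            public String map(String value) {
                return value + ": saw " + broadcastSet.size() + " broadcast elements";
            }
        })
        // 2. Register the data set under the name the mapper looks up.
        .withBroadcastSet(toBroadcast, "broadcastSetName")
        .print(); // triggers execution and prints the mapped values
    }
}
~~~

The string passed to `withBroadcastSet()` and the one passed to `getBroadcastVariable()` must be identical; otherwise the lookup fails at runtime.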
</div>
<div data-lang="scala" markdown="1">

@@ -2905,7 +2905,7 @@ data.map(new RichMapFunction[String, String]() {

Make sure that the names (`broadcastSetName` in the previous example) match when registering and
accessing broadcasted data sets. For a complete example program, have a look at
-{% gh_link /flink-examples/flink-scala-examples/src/main/scala/org/apache/flink/examples/scala/clustering/KMeans.scala#L96 "KMeans Algorithm" %}.
+{% gh_link /flink-examples/flink-examples-batch/src/main/scala/org/apache/flink/examples/scala/clustering/KMeans.scala#L96 "KMeans Algorithm" %}.
</div>
</div>

2 changes: 1 addition & 1 deletion docs/apis/local_execution.md
@@ -121,6 +121,6 @@ public static void main(String[] args) throws Exception {
}
~~~

-The `flink-java-examples` module contains a full example, called `CollectionExecutionExample`.
+The `flink-examples-batch` module contains a full example, called `CollectionExecutionExample`.

Please note that the execution of collection-based Flink programs is only possible on small data that fits into the JVM heap. The execution on collections is not multi-threaded; only one thread is used.
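A minimal sketch of such a collection-based program (not part of this commit; class and data names are assumptions rather than the actual `CollectionExecutionExample`):

~~~java
import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.api.java.DataSet;
import org.apache.flink.api.java.ExecutionEnvironment;

public class CollectionExecutionSketch {

    public static void main(String[] args) throws Exception {
        // Collection-based execution: everything runs in the local JVM, single-threaded,
        // so it is only suitable for small data sets that fit into the heap.
        ExecutionEnvironment env = ExecutionEnvironment.createCollectionsEnvironment();

        DataSet<String> input = env.fromElements("debug", "flink", "locally");

        input.map(new MapFunction<String, String>() {
            @Override
            public String map(String value) {
                return value.toUpperCase();
            }
        }).print(); // prints the transformed elements and triggers execution
    }
}
~~~

Because the collections environment executes on Java collections in a single thread, it is handy for stepping through a program in an IDE but unsuitable for anything beyond small test data.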
6 changes: 3 additions & 3 deletions docs/index.md
@@ -29,7 +29,7 @@ computations over data streams. Flink also builds batch processing on top of the
native iteration support, managed memory, and program optimization.

If you want to write your first program, look at one of the available quickstarts, and refer to the
-[DataSet API guide](apis/programming_guide.html) or the [DataStream API guide](apis/streaming_guide.html).
+[DataSet API guide](apis/batch/index.html) or the [DataStream API guide](apis/streaming/index.html).

## Stack

@@ -42,8 +42,8 @@ This is an overview of Flink's stack. Click on any component to go to the respec
<area shape="rect" coords="268,0,343,200" alt="Flink ML" href="libs/ml/">
<area shape="rect" coords="348,0,423,200" alt="Table" href="libs/table.html">

<area shape="rect" coords="188,205,538,260" alt="DataSet API (Java/Scala)" href="apis/programming_guide.html">
<area shape="rect" coords="543,205,893,260" alt="DataStream API (Java/Scala)" href="apis/streaming_guide.html">
<area shape="rect" coords="188,205,538,260" alt="DataSet API (Java/Scala)" href="apis/batch/index.html">
<area shape="rect" coords="543,205,893,260" alt="DataStream API (Java/Scala)" href="apis/streaming/index.html">

<!-- <area shape="rect" coords="188,275,538,330" alt="Optimizer" href="optimizer.html"> -->
<!-- <area shape="rect" coords="543,275,893,330" alt="Stream Builder" href="streambuilder.html"> -->
2 changes: 1 addition & 1 deletion docs/quickstart/java_api_quickstart.md
@@ -149,7 +149,7 @@ public class LineSplitter implements FlatMapFunction<String, Tuple2<String, Inte
}
~~~

-{% gh_link /flink-examples/flink-java-examples/src/main/java/org/apache/flink/examples/java/wordcount/WordCount.java "Check GitHub" %} for the full example code.
+{% gh_link /flink-examples/flink-examples-batch/src/main/java/org/apache/flink/examples/java/wordcount/WordCount.java "Check GitHub" %} for the full example code.

For a complete overview of our API, have a look at the [Programming Guide]({{ site.baseurl }}/apis/programming_guide.html) and [further example programs](examples.html). If you have any trouble, ask on our [Mailing List](http://mail-archives.apache.org/mod_mbox/flink-dev/). We are happy to provide help.

5 changes: 2 additions & 3 deletions flink-dist/pom.xml
@@ -289,9 +289,8 @@ under the License.
</filters>
<artifactSet>
<excludes>
-<exclude>org.apache.flink:flink-java-examples</exclude>
-<exclude>org.apache.flink:flink-scala-examples</exclude>
-<exclude>org.apache.flink:flink-streaming-examples</exclude>
+<exclude>org.apache.flink:flink-examples-batch</exclude>
+<exclude>org.apache.flink:flink-examples-streaming</exclude>
<exclude>org.apache.flink:flink-python</exclude>
<exclude>org.slf4j:slf4j-log4j12</exclude>
<exclude>log4j:log4j</exclude>
4 changes: 2 additions & 2 deletions flink-java8/pom.xml
@@ -101,7 +101,7 @@ under the License.
</configuration>
</plugin>

-<!-- get default data from flink-java-examples package -->
+<!-- get default data from flink-examples-batch package -->
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-dependency-plugin</artifactId>
@@ -117,7 +117,7 @@ under the License.
<artifactItems>
<artifactItem>
<groupId>org.apache.flink</groupId>
-<artifactId>flink-java-examples</artifactId>
+<artifactId>flink-examples-batch</artifactId>
<version>${project.version}</version>
<type>jar</type>
<overWrite>false</overWrite>
@@ -120,9 +120,8 @@ under the License.
<exclude>org.apache.flink:flink-optimizer</exclude>
<exclude>org.apache.flink:flink-clients</exclude>
<exclude>org.apache.flink:flink-avro</exclude>
-<exclude>org.apache.flink:flink-java-examples</exclude>
-<exclude>org.apache.flink:flink-scala-examples</exclude>
-<exclude>org.apache.flink:flink-streaming-examples</exclude>
+<exclude>org.apache.flink:flink-examples-batch</exclude>
+<exclude>org.apache.flink:flink-examples-streaming</exclude>
<exclude>org.apache.flink:flink-streaming-java</exclude>

<!-- Also exclude very big transitive dependencies of Flink
@@ -124,9 +124,8 @@ under the License.
<exclude>org.apache.flink:flink-optimizer</exclude>
<exclude>org.apache.flink:flink-clients</exclude>
<exclude>org.apache.flink:flink-avro</exclude>
-<exclude>org.apache.flink:flink-java-examples</exclude>
-<exclude>org.apache.flink:flink-scala-examples</exclude>
-<exclude>org.apache.flink:flink-streaming-examples</exclude>
+<exclude>org.apache.flink:flink-examples-batch</exclude>
+<exclude>org.apache.flink:flink-examples-streaming</exclude>
<exclude>org.apache.flink:flink-streaming-java</exclude>

<!-- Also exclude very big transitive dependencies of Flink
7 changes: 5 additions & 2 deletions tools/change-scala-version.sh
@@ -67,8 +67,8 @@ find "$BASEDIR" -name 'pom.xml' -not -path '*target*' -print \
-exec bash -c "sed_i 's/\(artifactId>flink.*\)'$FROM_SUFFIX'<\/artifactId>/\1'$TO_SUFFIX'<\/artifactId>/g' {}" \;

# fix for examples
find "$BASEDIR/flink-examples/flink-java-examples" -name 'pom.xml' -not -path '*target*' -print \
-exec bash -c "sed_i 's/\(<copy file=\".*flink-java-examples\)'$FROM_SUFFIX'/\1'$TO_SUFFIX'/g' {}" \;
find "$BASEDIR/flink-examples/flink-examples-batch" -name 'pom.xml' -not -path '*target*' -print \
-exec bash -c "sed_i 's/\(<copy file=\".*flink-examples-batch\)'$FROM_SUFFIX'/\1'$TO_SUFFIX'/g' {}" \;

find "$BASEDIR/flink-examples/flink-examples-streaming" -name 'pom.xml' -not -path '*target*' -print \
-exec bash -c "sed_i 's/\(<copy file=\".*flink-examples-streaming\)'$FROM_SUFFIX'/\1'$TO_SUFFIX'/g' {}" \;

# fix for quickstart
find "$BASEDIR/flink-quickstart" -name 'pom.xml' -not -path '*target*' -print \
2 changes: 1 addition & 1 deletion tools/deploy_to_maven.sh
@@ -70,7 +70,7 @@ function deploy_to_s3() {
pwd


-# Check if push/commit is eligible for pushing
+# Check if push/commit is eligible for deploying
echo "Job: $TRAVIS_JOB_NUMBER ; isPR: $TRAVIS_PULL_REQUEST ; repo slug : $TRAVIS_REPO_SLUG "
if [[ $TRAVIS_PULL_REQUEST == "false" ]] && [[ $TRAVIS_REPO_SLUG == "apache/flink" ]] ; then
