
[FLINK-7244] Add parquet table source #8064

Closed

Conversation

HuangZhenQiu
Contributor

What is the purpose of the change

Add projectable and filterable ParquetTableSource based on ParquetInputFormat

Brief change log

  1. Add Parquet table source with expression conversion to Parquet Filters
  2. Add unit test and integration test to test the correctness of Fields Projection and Predicate Push

Verifying this change

  • Added unit test for public functions of ParquetTableSource
  • Added integration tests for end-to-end test ParquetTableSource in batch mode

Does this pull request potentially affect one of the following parts:

  • Dependencies (does it add or upgrade a dependency): (no)
  • The public API, i.e., is any changed class annotated with @Public(Evolving): (no)
  • The serializers: (no)
  • The runtime per-record code paths (performance sensitive): (no)
  • Anything that affects deployment or recovery: JobManager (and its components), Checkpointing, Yarn/Mesos, ZooKeeper: (no)
  • The S3 file system connector: (no)

Documentation

  • Does this pull request introduce a new feature? (no)
  • If yes, how is the feature documented? (not applicable)

@flinkbot
Collaborator

flinkbot commented Mar 27, 2019

Thanks a lot for your contribution to the Apache Flink project. I'm the @flinkbot. I help the community
to review your pull request. We will use this comment to track the progress of the review.

Automated Checks

Last check on commit b793cb3 (Fri Aug 23 10:13:21 UTC 2019)

Warnings:

  • 1 pom.xml files were touched: Check for build and licensing issues.
  • No documentation files were touched! Remember to keep the Flink docs up to date!

Mention the bot in a comment to re-run the automated checks.

Review Progress

  • ❓ 1. The [description] looks good.
  • ❓ 2. There is [consensus] that the contribution should go into Flink.
  • ❓ 3. Needs [attention] from.
  • ❓ 4. The change fits into the overall [architecture].
  • ❓ 5. Overall code [quality] is good.

Please see the Pull Request Review Guide for a full explanation of the review process.


The Bot is tracking the review progress through labels. Labels are applied according to the order of the review items. For consensus, approval by a Flink committer or PMC member is required.

Bot commands

The @flinkbot bot supports the following commands:

  • @flinkbot approve description to approve one or more aspects (aspects: description, consensus, architecture and quality)
  • @flinkbot approve all to approve all aspects
  • @flinkbot approve-until architecture to approve everything until architecture
  • @flinkbot attention @username1 [@username2 ..] to require somebody's attention
  • @flinkbot disapprove architecture to remove an approval you gave earlier


@fhueske fhueske left a comment


Hi @HuangZhenQiu,
Thanks for the PR!
I've added a first round of comments but didn't have a look at the tests yet.
Will do that in the next days.

Thanks, Fabian

// the configuration to read the file
private final Configuration parquetConfig;

// type information of the data returned by the InputFormat

indent with tab

private final RowTypeInfo typeInfo;
// list of selected Parquet fields to return
private final int[] selectedFields;
// list of predicates to apply

Change the comment to "predicate expression to apply" since it's not really a list?

}
}
} else if (exp instanceof BinaryExpression) {
FilterPredicate c1 = toParquetPredicate(((Or) exp).left());

exp was only checked to be a BinaryExpression so it could also be an And. However, we know that we receive the predicates in CNF, so there should be no Ands. I think we can check for Or instead of And before.
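The check the reviewer suggests can be sketched as follows. This is an illustrative, self-contained simplification, not Flink's actual API: `Expression`, `Literal`, `Or`, and `toPredicate` are stand-in names, and the converter returns a textual predicate instead of a Parquet `FilterPredicate`. The point is that, because the planner hands predicates over in CNF, a binary expression at this level can only be an `Or`, so we test for `Or` directly rather than casting a generic `BinaryExpression`.

```java
abstract class Expression {}

class Literal extends Expression {
    final String value;
    Literal(String value) { this.value = value; }
}

class Or extends Expression {
    final Expression left;
    final Expression right;
    Or(Expression left, Expression right) { this.left = left; this.right = right; }
}

class CnfConverter {
    // Returns a textual predicate, or null if the expression cannot be converted.
    static String toPredicate(Expression exp) {
        if (exp instanceof Literal) {
            return ((Literal) exp).value;
        } else if (exp instanceof Or) {
            // In CNF there are no nested Ands at this level, so check for Or directly.
            Or or = (Or) exp;
            String c1 = toPredicate(or.left);
            String c2 = toPredicate(or.right);
            if (c1 == null || c2 == null) {
                return null; // one branch is unconvertible; drop the whole Or
            }
            return "or(" + c1 + ", " + c2 + ")";
        }
        return null; // unsupported expression type
    }
}
```

Dropping the whole `Or` when one branch fails is important for correctness: pushing down only half of a disjunction would filter out rows the other half should have kept.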

} else {
if (exp instanceof Or) {
return FilterApi.or(c1, c2);
} else if (exp instanceof And) {

there should be no Ands because we get the predicates in CNF

@HuangZhenQiu HuangZhenQiu force-pushed the add-parquet-table-source branch 3 times, most recently from 518ed2a to e49b8c7 Compare April 23, 2019 06:01
@HuangZhenQiu
Contributor Author

@fhueske
I just resolved your first round of comments. Please have another look when you have time.

@fhueske
Contributor

fhueske commented Apr 29, 2019

Sorry, I was out for a few days vacation. Will continue with my review soon.
Thanks, Fabian

@HuangZhenQiu HuangZhenQiu force-pushed the add-parquet-table-source branch 6 times, most recently from e72ed37 to ff12d2d Compare June 1, 2019 19:46

@fhueske fhueske left a comment


Thanks for your patience and the update @HuangZhenQiu!
I left a couple of comments.

Best, Fabian

flink-formats/flink-parquet/pom.xml

@Override
public boolean isFilterPushedDown() {
return predicate != null;

we can set a flag when applyPredicate() was called and return it here. This avoids multiple calls of applyPredicate() if no predicate can be successfully converted.
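A minimal sketch of this flag-based approach, under illustrative names (not Flink's exact classes): `applyPredicate()` records that pushdown was attempted, and `isFilterPushedDown()` returns that flag rather than checking whether a predicate was successfully converted. That way the planner does not keep retrying pushdown when no predicate could be converted.

```java
class FilterState {
    private String predicate;          // converted predicate; stays null if unconvertible
    private boolean filterPushedDown;  // set once applyPredicate() has run

    // Stand-in for the real expression-to-predicate conversion.
    private String tryConvert(String expr) {
        return expr.startsWith("supported") ? expr : null;
    }

    void applyPredicate(String expr) {
        this.predicate = tryConvert(expr);
        // Mark pushdown as done even if the conversion failed, so the planner
        // does not call applyPredicate() again for the same source.
        this.filterPushedDown = true;
    }

    boolean isFilterPushedDown() {
        return filterPushedDown;
    }

    String getPredicate() {
        return predicate;
    }
}
```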

} else if (exp instanceof BinaryExpression) {
if (exp instanceof And) {
LOG.debug("All predicates should be in CNF, but found an AND expression: {}.", exp);
} else if (exp instanceof Or){

add a space between ) and {

if (c1 == null || c2 == null) {
return null;
} else {
if (exp instanceof Or) {

I think this condition can be removed. We already checked that exp instanceof Or

} else {
if (exp instanceof Or) {
return FilterApi.or(c1, c2);
} else {

This else branch should be moved one condition outside.


// ensure table schema is the same
assertEquals(parquetTableSource.getTableSchema(), projected.getTableSchema());


Check that the source description is different (!parquetTableSource.explainSource().equals(projected.explainSource()))


@Override
public String explainSource() {
return "ParquetFile[path=" + path + ", schema=" + parquetSchema + ", filter=" + predicateString() + "]";

Include the projected fields in the source description. Flink checks whether the project pushdown worked by comparing the descriptions. If both are the same, the rule is not applied.
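The point can be sketched as follows: if the projected fields appear in the description, two sources that differ only in their projection produce different `explainSource()` strings, so the optimizer can see that pushdown took effect. This is an illustrative stand-in, not Flink's actual `ParquetTableSource` code.

```java
import java.util.Arrays;

class SourceDescription {
    // Builds a source description that includes the projected field indexes.
    static String explainSource(String path, int[] selectedFields, String filter) {
        return "ParquetFile[path=" + path
            + ", filter=" + filter
            + ", selectedFields=" + (selectedFields == null ? "null" : Arrays.toString(selectedFields))
            + "]";
    }
}
```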


private ParquetTableSource createNestedTestParquetTableSource() throws Exception {
Tuple3<Class<? extends SpecificRecord>, SpecificRecord, Row> nested = TestUtil.getNestedRecordTestData();
Path path = TestUtil.createTempParquetFile(tempRoot.getRoot(), TestUtil.NESTED_SCHEMA,

No need to create temp files here.


// ensure table schema is identical
assertEquals(parquetTableSource.getTableSchema(), projected.getTableSchema());


Check that the pushed down predicates were removed from exps.

@HuangZhenQiu HuangZhenQiu force-pushed the add-parquet-table-source branch 3 times, most recently from bad6b89 to e87a4d6 Compare June 23, 2019 22:47
@HuangZhenQiu
Contributor Author

@fhueske I resolved all of the comments except the last one. I'm not sure why the pushed-down predicates should be removed from exps.

@fhueske
Contributor

fhueske commented Jul 10, 2019

Hi @HuangZhenQiu, the predicates should be removed to let the optimizer know that they are handled by the table source. If they are not removed, the optimizer will generate code to re-evaluate predicates that the table source has already applied.

I'll check the PR now.
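The contract Fabian describes can be sketched like this: `applyPredicate()` receives a mutable list of predicates, keeps the ones the source can evaluate itself, and removes those from the list so the optimizer skips them. Class and method names here are illustrative stand-ins, not Flink's `FilterableTableSource` signature.

```java
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;

class PredicatePushdown {
    // Stand-in for "can this predicate be converted to a Parquet filter?"
    static boolean canHandle(String p) {
        return p.startsWith("supported");
    }

    // Removes every predicate the source will evaluate itself from the given
    // mutable list and returns the accepted ones.
    static List<String> applyPredicate(List<String> predicates) {
        List<String> accepted = new ArrayList<>();
        Iterator<String> it = predicates.iterator();
        while (it.hasNext()) {
            String p = it.next();
            if (canHandle(p)) {
                accepted.add(p);
                it.remove(); // consumed by the source; optimizer won't re-evaluate it
            }
        }
        return accepted;
    }
}
```

Predicates left in the list remain the optimizer's responsibility, so partial pushdown stays correct: unsupported predicates are still evaluated downstream.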


@fhueske fhueske left a comment


Thanks for the update @HuangZhenQiu!

The PR looks very good.
There are a few spots where we should respect the new code style guide, but I'll fix those and merge the PR.

Cheers, Fabian

// type information of the data returned by the InputFormat
private final RowTypeInfo typeInfo;
// list of selected Parquet fields to return
private final int[] selectedFields;

Add @Nullable annotation as required by the new code style guidelines (apache/flink-web#224)

// list of selected Parquet fields to return
private final int[] selectedFields;
// predicate expression to apply
private final FilterPredicate predicate;

Add @Nullable annotation

}

private ParquetTableSource(String path, MessageType parquetSchema, Configuration configuration,
boolean recursiveEnumeration, int[] selectedFields, FilterPredicate predicate) {

Add @Nullable annotation to selectedFields and predicate
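The convention the reviewer asks for can be sketched as below. Flink's guidelines use `javax.annotation.Nullable` (JSR-305); a local stand-in annotation is declared here only to keep the example self-contained, and the class is a simplified stand-in for the real table source.

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

// Local stand-in for javax.annotation.Nullable, just for this sketch.
@Retention(RetentionPolicy.CLASS)
@Target({ElementType.FIELD, ElementType.PARAMETER, ElementType.METHOD})
@interface Nullable {}

class ParquetSourceSketch {
    @Nullable
    private final int[] selectedFields; // null until a projection is pushed down

    @Nullable
    private final Object predicate;     // null until a filter is pushed down

    ParquetSourceSketch(@Nullable int[] selectedFields, @Nullable Object predicate) {
        this.selectedFields = selectedFields;
        this.predicate = predicate;
    }

    boolean hasProjection() {
        return selectedFields != null;
    }

    boolean hasPredicate() {
        return predicate != null;
    }
}
```

Annotating the fields and constructor parameters documents that `null` is a legal state, so callers and static analysis tools know to check before dereferencing.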

/**
* Converts flink Expression to parquet FilterPredicate.
*/
private FilterPredicate toParquetPredicate(Expression exp) {

Add @Nullable annotation

);

// ensure ParquetInputFormat is configured with selected fields
ParquetTableSource spyPTS = spy(projected);

our new code style guidelines discourage the use of Mockito.

* </pre>
*/
public class ParquetTableSource
implements BatchTableSource<Row>, FilterableTableSource<Row>, ProjectableTableSource<Row> {

The new Blink query optimizer and execution is based on the InputFormatTableSource abstract class. We can simply implement both, BatchTableSource and InputFormatTableSource to support both of Flink's SQL engines.


OK, seems that we would need to add a dependency on flink-streaming for this. I'll keep it as it is for now.

fhueske pushed a commit to fhueske/flink that referenced this pull request Jul 10, 2019
@flinkbot
Collaborator

CI report for commit b793cb3: SUCCESS Build

@asfgit asfgit closed this in 38e5e81 Jul 10, 2019