
[FLINK-18200][python] Replace the deprecated interfaces with the new interfaces in the tests and examples #12770

Closed
wants to merge 9 commits

Conversation

Member

@SteNicholas SteNicholas commented Jun 25, 2020

What is the purpose of the change

Currently, a few deprecated interfaces, e.g. register_function, are still heavily used in the tests and examples. These usages should be replaced with the new interfaces, such as create_temporary_system_function, to improve the tests and examples.
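As a rough sketch of the kind of call-site change involved (the UDF, names and environment setup below are illustrative, not taken from this PR's diff), assuming the 1.11 Table API:

    from pyflink.table import DataTypes, EnvironmentSettings, StreamTableEnvironment
    from pyflink.table.udf import udf

    # Illustrative environment setup, similar to the PyFlink examples.
    t_env = StreamTableEnvironment.create(
        environment_settings=EnvironmentSettings.new_instance()
            .in_streaming_mode().use_blink_planner().build())

    # Hypothetical UDF used only for illustration.
    add_one = udf(lambda i: i + 1,
                  input_types=[DataTypes.BIGINT()],
                  result_type=DataTypes.BIGINT())

    # Deprecated registration still used in many tests/examples:
    #   t_env.register_function("add_one", add_one)
    # Replacement this PR switches to:
    t_env.create_temporary_system_function("add_one", add_one)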

Brief change log

  • Modify the tests and examples that call the deprecated insert_into, scan, sql_update, register_function and register_java_function methods of TableEnvironment (a sketch of the replacements follows this list).
  • Modify the tests and examples that call the deprecated insert_into method of Table.
  • Deprecate the from_table_source, _from_elements and connect methods of TableEnvironment.
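A rough sketch of the remaining replacements, continuing the t_env from the sketch above (the table names my_source and my_sink are illustrative and assumed to be registered already):

    # TableEnvironment.scan -> TableEnvironment.from_path
    tab = t_env.from_path("my_source")    # was: t_env.scan("my_source")

    # TableEnvironment.sql_update -> TableEnvironment.execute_sql
    t_env.execute_sql("INSERT INTO my_sink SELECT * FROM my_source")
    # was: t_env.sql_update("INSERT INTO ...") followed by t_env.execute(job_name)

    # Table.insert_into -> Table.execute_insert
    tab.execute_insert("my_sink")         # was: tab.insert_into("my_sink") + t_env.execute(job_name)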

Verifying this change

  • Modify the unit tests that call the deprecated insert_into, scan, sql_update, register_function and register_java_function methods of TableEnvironment.
  • Modify the unit tests that call the deprecated insert_into method of Table.

Does this pull request potentially affect one of the following parts:

  • Dependencies (does it add or upgrade a dependency): (yes / no)
  • The public API, i.e., is any changed class annotated with @Public(Evolving): (yes / no)
  • The serializers: (yes / no / don't know)
  • The runtime per-record code paths (performance sensitive): (yes / no / don't know)
  • Anything that affects deployment or recovery: JobManager (and its components), Checkpointing, Kubernetes/Yarn/Mesos, ZooKeeper: (yes / no / don't know)
  • The S3 file system connector: (yes / no / don't know)

Documentation

  • Does this pull request introduce a new feature? (yes / no)
  • If yes, how is the feature documented? (not applicable / docs / JavaDocs / not documented)

@flinkbot
Collaborator

Thanks a lot for your contribution to the Apache Flink project. I'm the @flinkbot. I help the community
to review your pull request. We will use this comment to track the progress of the review.

Automated Checks

Last check on commit b2f1ab0 (Thu Jun 25 18:22:35 UTC 2020)

Warnings:

  • No documentation files were touched! Remember to keep the Flink docs up to date!

Mention the bot in a comment to re-run the automated checks.

Review Progress

  • ❓ 1. The [description] looks good.
  • ❓ 2. There is [consensus] that the contribution should go into Flink.
  • ❓ 3. Needs [attention] from.
  • ❓ 4. The change fits into the overall [architecture].
  • ❓ 5. Overall code [quality] is good.

Please see the Pull Request Review Guide for a full explanation of the review process.


The Bot is tracking the review progress through labels. Labels are applied according to the order of the review items. For consensus, approval by a Flink committer or PMC member is required.

Bot commands
The @flinkbot bot supports the following commands:

  • @flinkbot approve description to approve one or more aspects (aspects: description, consensus, architecture and quality)
  • @flinkbot approve all to approve all aspects
  • @flinkbot approve-until architecture to approve everything until architecture
  • @flinkbot attention @username1 [@username2 ..] to require somebody's attention
  • @flinkbot disapprove architecture to remove an approval you gave earlier

@flinkbot
Collaborator

flinkbot commented Jun 25, 2020

CI report:

Bot commands

The @flinkbot bot supports the following commands:
  • @flinkbot run travis re-run the last Travis build
  • @flinkbot run azure re-run the last Azure build

…precated-interface

* 'master' of github.com:SteNicholas/flink: (344 commits)
  [FLINK-18883][python] Support reduce() operation for Python KeyedStream. (#13113)
  [FLINK-18848][python] Fix to_pandas to handle retraction data properly
  [FLINK-18865][doc] Update Kafka doc for setStartFromEarliest method
  [FLINK-18874][python] Support conversions between Table and DataStream. (#13107)
  [FLINK-18862][table-planner-blink] Fix LISTAGG throws BinaryRawValueData cannot be cast to StringData exception during runtime
  [FLINK-18861][python] Support add_source() for Python DataStream API. (#13095)
  [FLINK-18864][python] Support key_by() operation for Python DataStream API. (#13097)
  [FLINK-18798][docs-zh] Translate "Debugging Windows & Event Time" page of "Debugging & Monitoring" into Chinese
  [FLINK-18859][tests] Increase timeout of ExecutionGraphNotEnoughResourceTest#testRestartWithSlotSharingAndNotEnoughResources to make it more stable
  [hotfix] Skip e2e tests for aarch64
  [hotfix][table-planner-blink] FlinkStreamProgram should not use FlinkBatchRuleSets (#13100)
  [FLINK-18766][python] Support add_sink() for Python DataStream API. (#13094)
  [FLINK-18688][table-planner-blink] Fix binary row writing with incorrect order in ProjectionCodeGenerator by removing for loop optimization
  [FLINK-18847][docs][python] Add documentation about data types in Python Table API
  [FLINK-18678][hive][doc] Update doc about setting hive version
  [hotfix][docs] Fix the link-tags of 'Side Outputs' page of 'DataStream API' (#13087)
  [FLINK-18765][python] Support map() and flat_map() for Python DataStream API. (#13066)
  [FLINK-17503][runtime] [logs] Refactored log output.
  [FLINK-16510] Allow configuring shutdown behavior to avoid JVM freeze
  [FLINK-18838][python] Support JdbcCatalog in Python Table API
  ...

# Conflicts:
#	flink-python/pyflink/table/tests/test_dependency.py
#	flink-python/pyflink/table/tests/test_pandas_udf.py
#	flink-python/pyflink/table/tests/test_udf.py
#	flink-python/pyflink/table/tests/test_udtf.py
…precated-interface

* 'master' of github.com:SteNicholas/flink: (64 commits)
  [FLINK-18879][python] Support Row Serialization and Deserialization schemas for Python DataStream API. (#13150)
  [FLINK-18956][task] StreamTask.invoke should catch Throwable
  [FLINK-18220][runtime] Enrich heap space OOMs with memory configuration information
  [FLINK-18935] Reject CompletedOperationCache.registerOngoingOperation if cache is shutting down
  [FLINK-18936][docs] Update documentation around aggregate functions
  [hotfix][table] Keep aggregate functions in sync with code generation
  [FLINK-18884][python] Add chaining strategy and slot sharing group interfaces for Python DataStream API. (#13140)
  [FLINK-18901][table] Use new type inference for aggregate functions in SQL DDL
  [FLINK-18878][python] Support dependency management for Python StreamExecutionEnvironment. (#13136)
  [FLINK-18719][runtime] Introduce new ActiveResourceManager.
  [FLINK-18719][runtime] Introduce interfaces ResourceManagerDriver and ResourceEventHandler.
  [FLINK-18719][runtime] Rename ActiveResourceManager to LegacyActiveResourceManager.
  [hotfix][runtime] Add equals() and hashCode() or CommonProcessMemorySpec.
  [hotfix][runtime] Refactor ResourceManager#onTaskManagerRegistration argument type.
  [hotfix][runtime] Guard ResourceManager#startServicesOnLeadership from being overridden.
  [hotfix][runtime] Refactor resource manager termination handling.
  [hotfix][runtime] Make ResourceManager#onStart final.
  [hotfix][runtime] Move PendingWorkerCounter to a separate file.
  [hotfix][runtime] Move active resource manager related classes to separate package.
  [FLINK-18659][hive][orc] Fix streaming write for Hive 1.x Orc table
  ...

@dianfu dianfu left a comment


@SteNicholas Thanks for the PR. LGTM with just one minor comment.

@@ -1425,7 +1426,7 @@ def _from_elements(self, elements, schema):
        :param elements: The elements to create a table from.
        :return: The result :class:`~pyflink.table.Table`.
        """

        warnings.warn("Deprecated in 1.11.", DeprecationWarning)
Contributor

We still don't want to deprecate this method in Python.

Member Author

@dianfu I was just following the API of TableEnvironment, which deprecates this method. I would like to follow your comment.

@@ -67,3 +67,7 @@ def tz_convert_to_internal(s, t: DataType, local_tz):
    elif is_datetime64tz_dtype(s.dtype):
        return s.dt.tz_convert(local_tz).dt.tz_localize(None)
    return s


def exec_insert_table(table, table_path) -> JobExecutionResult:
Contributor

What about moving to test_case_utils?

Member Author

@SteNicholas SteNicholas Aug 17, 2020


@dianfu Do you mean to regard exec_insert_table as a test util method?

Contributor

@dianfu dianfu Aug 17, 2020


This method is only used in test cases, so I prefer to move it to test_case_utils. I see it's also used in word_count.py; however, I suggest not using it there. This method isn't a public interface and we'd better not expose it to users. Using it in word_count will encourage users to call this method directly.
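For reference, a plausible shape of that helper, assuming it just wraps Table.execute_insert and blocks until the job finishes (a sketch, not necessarily the exact code in this PR):

    from pyflink.common import JobExecutionResult


    def exec_insert_table(table, table_path) -> JobExecutionResult:
        # Execute the INSERT into the given sink path and wait for the job
        # to finish, mirroring the old insert_into(...) +
        # TableEnvironment.execute(...) behaviour the tests relied on.
        return table.execute_insert(table_path) \
            .get_job_client() \
            .get_job_execution_result() \
            .result()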

…precated-interface

* 'master' of github.com:SteNicholas/flink: (41 commits)
  [FLINK-18994][doc-zh] Fix typo in setup taskmanager memory page of Chinese doc translations.
  [FLINK-17427][table-planner-blink] Support SupportsPartitionPushDown in planner
  [FLINK-18814][docs-zh] Translate the 'Side Outputs' page of 'DataStream API' into Chinese
  [FLINK-18930][docs-zh] Translate "Hive Dialect" page of "Hive Integration" into Chinese
  [FLINK-15448][runtime] Add Node information to the ResourceId of TaskExecutor in Yarn
  [FLINK-15448][runtime] Add metadata to ResourceID
  [hotfix][yarn] remove incorrect javadoc
  [FLINK-18985][python][doc] Update the Sphinx doc for Python DataStream API. (#13191)
  [FLINK-18643][Azure] Build a Flink snapshot release with the nightly cron-job.
  [hotfix][docs][checkpointing] Fix typos
  [FLINK-18910][docs][python] Create the new documentation structure for Python documentation according to FLIP-133.
  [FLINK-18949][python] Support Streaming File Sink for Python DataStream API. (#13156)
  [FLINK-18752][yarn] Allow shipping single files for yarn execution
  [FLINK-18219][runtime] Removed TestRestServerEndpoint from flink-client module. Instead, use newly added test utility class in flink-runtime.
  [FLINK-18219][runtime] Added OOM-enrichment for REST calls
  [hotfix][docs] Replaced outdated 'RocksDBStateBackend.setOptions(..)' by 'RocksDBStateBackend.setRocksDBOptions(..)' and fixed typo in parameter list (PptionsFactory -> RocksDBOptionsFactory).
  [FLINK-18965][sql-client] Exclude hadoop-hdfs transitive dependency for flink-sql-client
  [FLINK-18944][python] Support JDBC connector for Python DataStream API. (#13169)
  [FLINK-18972][runtime] BulkSlotProviderImpl disables batch slot request timeout check only when its slot allocation interface is really in use
  [FLINK-18212][table-planner-blink] Fix Lookup Join failed when there is a UDF equal condition on the column of temporal table
  ...

# Conflicts:
#	flink-python/pyflink/table/tests/test_descriptor.py
#	flink-python/pyflink/testing/test_case_utils.py
@dianfu dianfu closed this in b6592fc Aug 19, 2020

4 participants