[FLINK-16101][docs-zh][table] Translate "Hive Functions" page of "Hive Integration" into Chinese

This closes apache#11664
20010079 authored and wuchong committed Jun 9, 2020
1 parent 926523e commit dee868d
Showing 1 changed file with 22 additions and 24 deletions.
46 changes: 22 additions & 24 deletions docs/dev/table/hive/hive_functions.zh.md
@@ -1,5 +1,5 @@
---
title: "Hive functions"
title: "Hive 函数"
nav-parent_id: hive_tableapi
nav-pos: 3
---
@@ -22,11 +22,11 @@ specific language governing permissions and limitations
under the License.
-->

## Use Hive Built-in Functions via HiveModule
## 通过 HiveModule 使用 Hive 内置函数

The `HiveModule` provides Hive built-in functions as Flink system (built-in) functions to Flink SQL and Table API users.
Flink SQL 和 Table API 中,可以通过系统内置的 `HiveModule` 来使用 Hive 内置函数,

For detailed information, please refer to [HiveModule]({{ site.baseurl }}/dev/table/modules.html#hivemodule).
详细信息,请参考 [HiveModule]({{ site.baseurl }}/zh/dev/table/modules.html#hivemodule)

<div class="codetabs" markdown="1">
<div data-lang="Java" markdown="1">
@@ -58,39 +58,38 @@ modules:
</div>
</div>
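
A minimal Java sketch of loading the `HiveModule` through the Table API; the module name `"myhive"` and the Hive version `"2.3.4"` below are placeholders and should match your environment:

{% highlight java %}
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;
import org.apache.flink.table.module.hive.HiveModule;

// Blink planner session; Hive integration requires the Blink planner.
EnvironmentSettings settings = EnvironmentSettings.newInstance().useBlinkPlanner().build();
TableEnvironment tableEnv = TableEnvironment.create(settings);

// Expose Hive built-in functions (e.g. get_json_object) as Flink system functions.
tableEnv.loadModule("myhive", new HiveModule("2.3.4"));
{% endhighlight %}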

* NOTE that some Hive built-in functions in older versions have [thread safety issues](https://issues.apache.org/jira/browse/HIVE-16183).
We recommend users patch their own Hive to fix them.
* 请注意旧版本的部分 Hive 内置函数存在[线程安全问题](https://issues.apache.org/jira/browse/HIVE-16183)
我们建议用户及时通过补丁修正 Hive 中的这些问题。

## Hive User Defined Functions
## Hive 用户自定义函数(User Defined Functions)

Users can use their existing Hive User Defined Functions in Flink.
在 Flink 中用户可以使用 Hive 里已经存在的 UDF 函数。

Supported UDF types include:
支持的 UDF 类型包括:

- UDF
- GenericUDF
- GenericUDTF
- UDAF
- GenericUDAFResolver2

Upon query planning and execution, Hive's UDF and GenericUDF are automatically translated into Flink's ScalarFunction,
Hive's GenericUDTF is automatically translated into Flink's TableFunction,
and Hive's UDAF and GenericUDAFResolver2 are translated into Flink's AggregateFunction.
在进行查询规划和执行时,Hive UDF 和 GenericUDF 函数会自动转换成 Flink 中的 ScalarFunction,GenericUDTF 会被自动转换成 Flink 中的
TableFunction,UDAF 和 GenericUDAFResolver2 则转换成 Flink 聚合函数(AggregateFunction).
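
For example, with a `TableEnvironment` whose current catalog is a `HiveCatalog` containing `mygenericudf` (registered as in the example further down this page), a Hive `GenericUDF` is invoked like any Flink scalar function; `mysourcetable` is a hypothetical table used only for illustration:

{% highlight java %}
import org.apache.flink.table.api.Table;

// mygenericudf is a Hive GenericUDF; Flink plans and runs it as a ScalarFunction.
Table result = tableEnv.sqlQuery("SELECT mygenericudf(name, 1) AS a FROM mysourcetable");
{% endhighlight %}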

To use a Hive User Defined Function, users have to
想要使用 Hive UDF 函数,需要如下几步:

- set a HiveCatalog backed by Hive Metastore that contains that function as current catalog of the session
- include a jar that contains that function in Flink's classpath
- use Blink planner.
- 通过 Hive Metastore 将带有 UDF 的 HiveCatalog 设置为当前会话的 catalog 后端。
- 将带有 UDF 的 jar 包放入 Flink classpath 中,并在代码中引入。
- 使用 Blink planner
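
A Java sketch of the catalog and planner setup (the catalog name, default database, and Hive conf directory below are assumptions; the jar containing the functions additionally has to be on Flink's classpath, for example by placing it under the distribution's `lib/` directory):

{% highlight java %}
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;
import org.apache.flink.table.catalog.hive.HiveCatalog;

// Use the Blink planner.
EnvironmentSettings settings = EnvironmentSettings.newInstance().useBlinkPlanner().build();
TableEnvironment tableEnv = TableEnvironment.create(settings);

// Register a HiveCatalog backed by the Hive Metastore that contains the functions,
// and make it the current catalog of the session.
HiveCatalog hiveCatalog = new HiveCatalog("myhive", "default", "/opt/hive-conf");
tableEnv.registerCatalog("myhive", hiveCatalog);
tableEnv.useCatalog("myhive");
{% endhighlight %}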

## Using Hive User Defined Functions
## 使用 Hive UDF

Assuming we have the following Hive functions registered in Hive Metastore:
假设我们在 Hive Metastore 中已经注册了下面的 UDF 函数:


{% highlight java %}
/**
* Test simple udf. Registered under name 'myudf'
* 注册为 'myudf' 的简单 UDF 测试类.
*/
public class TestHiveSimpleUDF extends UDF {

@@ -104,7 +103,7 @@ public class TestHiveSimpleUDF extends UDF {
}

/**
* Test generic udf. Registered under name 'mygenericudf'
* 注册为 'mygenericudf' 的普通 UDF 测试类
*/
public class TestHiveGenericUDF extends GenericUDF {

@@ -137,7 +136,7 @@ public class TestHiveGenericUDF extends GenericUDF {
}

/**
* Test split udtf. Registered under name 'mygenericudtf'
* 注册为 'mygenericudtf' 的字符串分割 UDF 测试类
*/
public class TestHiveUDTF extends GenericUDTF {

@@ -172,7 +171,7 @@ public class TestHiveUDTF extends GenericUDTF {

{% endhighlight %}

From Hive CLI, we can see they are registered:
Hive CLI 中,可以查询到已经注册的 UDF 函数:

{% highlight bash %}
hive> show functions;
@@ -184,8 +183,7 @@ myudtf

{% endhighlight %}
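
On the Flink side, the same names should show up once the `HiveCatalog` sketched above is the current catalog; `listFunctions()` returns Flink's system functions plus the catalog functions of the current catalog and database:

{% highlight java %}
// Should now include myudf, mygenericudf and myudtf alongside Flink's own functions.
String[] functions = tableEnv.listFunctions();
{% endhighlight %}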


Then, users can use them in SQL as:
此时,用户如果想使用这些 UDF,在 SQL 中就可以这样写:


{% highlight bash %}
