This topic describes how to create, register, and use a user-defined aggregate function (UDAF) in Realtime Compute for Apache Flink.
Definition
A UDAF aggregates multiple values into a single value: a many-to-one mapping is established between the input data and the output data, so multiple input records are aggregated to generate one output value.
Limits
- Only Apache Flink 1.12 and later are supported.
- Python 3.7.9 is pre-installed on a fully managed Flink cluster. Therefore, you must develop code in Python 3.7.9.
- JDK 1.8 is used in the runtime environment of fully managed Flink. If your Python job depends on a third-party JAR package, make sure that the JAR package is compatible with JDK 1.8.
- Only open source Scala 2.11 is supported. If your Python job depends on a third-party JAR package, make sure that the JAR package that corresponds to Scala 2.11 is used.
Create a UDAF
- Download and decompress the python_demo-master package to your on-premises machine.
- In the main menu bar of PyCharm, choose File > Open to open the decompressed python_demo-master package.
- Double-click the udfs.py file in the \python_demo-master\udx directory. Then, modify the content of the file based on your business requirements.
In this example, weighted_avg computes the weighted average of the current data and historical data.
from pyflink.common import Row
from pyflink.table import AggregateFunction, DataTypes
from pyflink.table.udf import udaf

class WeightedAvg(AggregateFunction):

    def create_accumulator(self):
        # Row(sum, count)
        return Row(0, 0)

    def get_value(self, accumulator: Row) -> float:
        if accumulator[1] == 0:
            return 0
        else:
            return accumulator[0] / accumulator[1]

    def accumulate(self, accumulator: Row, value, weight):
        accumulator[0] += value * weight
        accumulator[1] += weight

    def retract(self, accumulator: Row, value, weight):
        accumulator[0] -= value * weight
        accumulator[1] -= weight

weighted_avg = udaf(f=WeightedAvg(),
                    result_type=DataTypes.DOUBLE(),
                    accumulator_type=DataTypes.ROW([
                        DataTypes.FIELD("f0", DataTypes.BIGINT()),
                        DataTypes.FIELD("f1", DataTypes.BIGINT())]))
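The accumulator contract used above (create_accumulator, accumulate, retract, get_value) can be sketched in plain Python to show how the weighted average is maintained incrementally. This is an illustrative sketch only, with no pyflink dependency; the list-based accumulator and the sample values are made up for demonstration:

```python
# Sketch of the UDAF accumulator contract, assuming a two-slot
# accumulator: [weighted_sum, weight_sum].

def create_accumulator():
    return [0, 0]

def accumulate(acc, value, weight):
    # Fold one input record into the accumulator.
    acc[0] += value * weight
    acc[1] += weight

def retract(acc, value, weight):
    # Undo a previously accumulated record (used for retraction streams).
    acc[0] -= value * weight
    acc[1] -= weight

def get_value(acc):
    # Emit the current aggregate; guard against an empty accumulator.
    return 0 if acc[1] == 0 else acc[0] / acc[1]

acc = create_accumulator()
accumulate(acc, 10, 1)
accumulate(acc, 20, 3)
print(get_value(acc))  # (10*1 + 20*3) / (1 + 3) = 17.5
retract(acc, 20, 3)
print(get_value(acc))  # 10.0
```

Because retract exactly reverses accumulate, the UDAF can also be used in queries whose upstream operators emit retraction messages.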
- Go to the \python_demo-master directory that contains the udx folder, and run the following command to package the files in the directory:
zip -r python_demo.zip udx
If the python_demo.zip package appears in the \python_demo-master\ directory, the UDAF is created.
Register a UDAF
For more information about how to register a UDAF, see .
Use a UDAF
- Use Flink SQL to create a job. For more information, see Develop a job.
The following statements compute the weighted average of the a field in the ASI_UDAF_Source table, with the b field as the weight. Sample code:
CREATE TEMPORARY TABLE ASI_UDAF_Source (
  a BIGINT,
  b BIGINT
) WITH (
  'connector' = 'datagen'
);

CREATE TEMPORARY TABLE ASI_UDAF_Sink (
  avg_value DOUBLE
) WITH (
  'connector' = 'blackhole'
);

INSERT INTO ASI_UDAF_Sink
SELECT weighted_avg(a, b)
FROM ASI_UDAF_Source;
- On the Deployments page in the console of fully managed Flink, find the job that you want to start, and click Start in the Actions column.
After the job starts, the weighted average of the current and historical values of the a field in the ASI_UDAF_Source table, with the b field as the weight, is written to the ASI_UDAF_Sink table.
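As a sanity check on these semantics: for any set of input rows, the value that weighted_avg writes to the sink is sum(a*b) / sum(b). A short plain-Python illustration (no Flink required; the sample rows are made up):

```python
# Hypothetical (a, b) rows from ASI_UDAF_Source; b is the weight.
rows = [(10, 1), (20, 3), (30, 2)]

# weighted_avg(a, b) over all rows seen so far: sum(a*b) / sum(b).
weighted_sum = sum(a * b for a, b in rows)
weight_total = sum(b for _, b in rows)
print(weighted_sum / weight_total)  # (10 + 60 + 60) / 6 ≈ 21.67
```

Note that because the datagen source produces unbounded random data, the actual values in ASI_UDAF_Sink will differ from this made-up sample; the formula is what carries over.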