2022-03-04 | Fully managed Flink | New version | Code templates
Code templates can help you quickly understand the features and related syntax of fully managed Flink and implement your own business logic. This topic describes the usage scenarios of code templates and how to use a code template.
2022-03-04 | Fully managed Flink | New version | Data synchronization templates
You can use a data synchronization template to quickly generate Flink SQL job code for data synchronization. This topic describes how to use a multi-database, multi-table synchronization template or a sharded-data merging template to synchronize data.
2022-03-04 | Fully managed Flink | New version | Job diagnostics
If a job is at risk or abnormal, you can use the job diagnostics feature to automatically locate the possible causes of the risk or exception. Then, you can quickly restore the job based on the handling suggestions provided by fully managed Flink. This topic describes how to use the job diagnostics feature.
2022-03-04 | Fully managed Flink | New version | View jobs that have risks and abnormal jobs
On the Deployments page of the fully managed Flink console, you can quickly view the numbers and lists of normal jobs, abnormal jobs, and at-risk jobs to monitor job health in real time. This topic describes how to view the numbers and lists of abnormal jobs and at-risk jobs.
2022-03-04 | Fully managed Flink | New version | View the exception logs of a job
This topic describes how to view the exception logs of a job, including JobManager exception logs, logs of failed TaskManagers, and logs of TaskManagers on which checkpoints are created at a low speed.
2022-03-04 | Fully managed Flink | New version | Change the level of logs for a job
If you cannot locate an issue from logs at the INFO level, you can change the log level to DEBUG. This topic describes how to change the log level for a job and provides the related limits and precautions.
2022-03-04 | Fully managed Flink | New version |
These topics provide the DDL syntax used to create an AnalyticDB for PostgreSQL result table and the DDL syntax used to create an AnalyticDB for PostgreSQL dimension table, describe the parameters in the WITH clause, and provide data type mappings.
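To illustrate the general shape of such a DDL statement, here is a minimal sketch of a Flink SQL result-table definition. The connector identifier and the WITH option names below are illustrative assumptions, not the authoritative parameter list; consult the topic itself for the exact options.

```sql
-- Hedged sketch of a result-table DDL for AnalyticDB for PostgreSQL.
-- The connector name and WITH option keys are assumptions for
-- illustration only; the topic documents the exact parameters.
CREATE TABLE adbpg_sink (
  id BIGINT,
  name VARCHAR,
  PRIMARY KEY (id) NOT ENFORCED
) WITH (
  'connector' = 'adbpg',                              -- assumed identifier
  'url' = 'jdbc:postgresql://<host>:<port>/<db>',     -- placeholder endpoint
  'tablename' = '<table>',
  'username' = '<user>',
  'password' = '<password>'
);
```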
2022-03-04 | Fully managed Flink | New version |
These topics provide the DDL syntax used to create a Faker source table and the DDL syntax used to create a Faker dimension table, describe the limits and the parameters in the WITH clause, and provide sample code.
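As a hedged sketch of what a Faker source-table DDL can look like: the field-expression option names below follow the common flink-faker convention (`fields.<field>.expression` with Datafaker expressions) and may differ from the exact WITH parameters documented in the topic.

```sql
-- Hedged sketch of a Faker source-table DDL; option names follow
-- the common flink-faker convention and are not taken from the topic.
CREATE TEMPORARY TABLE faker_source (
  user_name VARCHAR,
  user_age INT
) WITH (
  'connector' = 'faker',
  'fields.user_name.expression' = '#{Name.name}',
  'fields.user_age.expression' = '#{number.numberBetween ''18'',''60''}'
);
```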
2022-03-04 | Fully managed Flink | New version | AUTO OPTIMIZE statement
You can use the AUTO OPTIMIZE statement to start a streaming optimization task that automatically optimizes tables in external data lakes. This topic describes the background information, prerequisites, limits, precautions, basic syntax, and parameter configuration of the AUTO OPTIMIZE statement.
2022-03-04 | Fully managed Flink | New version | -
After you configure a Data Lake Formation (DLF) catalog, you can access the tables of a DLF instance in the fully managed Flink console. This topic describes how to configure, view, and delete a DLF catalog in the console.