This topic provides answers to some frequently asked questions about batch synchronization.
Overview
You can search this topic by keyword to find common questions and view their solutions.
- What do I do if the error message `Communications link failure` is returned when I run a batch synchronization task to read data from or write data to a MySQL data source?
- What do I do if the error message `[TASK_MAX_SLOT_EXCEED]:Unable to find a gateway that meets resource requirements. 20 slots are requested, but the maximum is 16 slots` is returned when I run a batch synchronization task to synchronize data?
- What do I do if the error message `OutOfMemoryError: Java heap space` is returned when I run a batch synchronization task to synchronize data?
- What do I do if the error message `Duplicate entry 'xxx' for key 'uk_uk_op'` is returned when I run a batch synchronization task to synchronize data?
- What do I do if the error message `Task have SSRF attacks` is returned when I run a batch synchronization task to synchronize data?
- What do I do if the error message `The download session is expired` is returned when I run a batch synchronization task to read data from a MaxCompute table?
- What do I do if the error message `Error writing request body to server` is returned when I run a batch synchronization task to write data to a MaxCompute table?
- What do I do if the error message `Application was streaming results when the connection failed. Consider raising value of 'net_write_timeout/net_read_timeout' on the server` is returned when I run a batch synchronization task to read data from or write data to ApsaraDB RDS for MySQL?
- What do I do if the error message `[DBUtilErrorCode-05]ErrorMessage: Code:[DBUtilErrorCode-05]Description:[Failed to write data to the specified table.]. - com.mysql.jdbc.exceptions.jdbc4.MySQLNonTransientConnectionException: No operations allowed after connection closed` is returned when I run a batch synchronization task to synchronize data to a MySQL data source?
- What do I do if the error message `The last packet successfully received from the server was 902,138 milliseconds ago` is returned when I run a batch synchronization task to read data from a MySQL data source?
- What do I do if the error message `org.postgresql.util.PSQLException: FATAL: terminating connection due to conflict with recovery` is returned when I run a batch synchronization task to synchronize data from PostgreSQL?
- What do I do if the error message `Host is blocked` is returned when I run a batch synchronization task to synchronize data from an Amazon RDS data source?
- What do I do if the error message `no master` is returned when I run a batch synchronization task to synchronize data from a MongoDB data source?
- What do I do if the error message `MongoExecutionTimeoutException: operation exceeded time limit` is returned when I run a batch synchronization task to synchronize data from a MongoDB data source?
- What do I do if the error message `DataXException: operation exceeded time limit` is returned when I run a batch synchronization task to synchronize data from a MongoDB data source?
- What do I do if the error message `Code:[RedisWriter-04], Description:[Dirty data]. - source column number is in valid!` is returned when data is written to Redis in hash mode?
- What do I do if the error message `AccessDenied The bucket you access does not belong to you` is returned when I run a batch synchronization task to synchronize data from OSS?
- What do I do if the error message `Could not get block locations` is returned when I run a batch synchronization task to synchronize data to an on-premises Hive data source?
- How do I write data such as the string `"[1,2,3,4,5]"` from a data source to an Elasticsearch data source as an array?
- What do I do if the error message `ERROR ESReaderUtil - ES_MISSING_DATE_FORMAT, Unknown date value. please add "dataFormat". sample value:` is returned when I run a batch synchronization task to synchronize data from an Elasticsearch data source?
- What do I do if the error message `com.alibaba.datax.common.exception.DataXException: Code:[Common-00]` is returned when I run a batch synchronization task to synchronize data from an Elasticsearch data source?
- What do I do if the error message `version_conflict_engine_exception` is returned when I run a batch synchronization task to synchronize data to an Elasticsearch data source?
- What do I do if the error message `illegal_argument_exception` is returned when I run a batch synchronization task to synchronize data to an Elasticsearch data source?
- What do I do if the error message `dense_vector` is returned when I run a batch synchronization task to synchronize data from fields of an array data type in a MaxCompute data source to an Elasticsearch data source?
- Why does this issue occur when I run a batch synchronization task that has the `cleanup=true` setting configured?
- What do I do if the error message `JSON data returned based on the path:[] condition is not of the ARRAY type` is returned when I use RestAPI Writer to write data?
- How do I use the `_tags` and `is_timeseries_tag` fields in the configurations of a time series model to read or write data?
- What do I do if the error message `plugin xx does not specify column` is returned when I run a batch synchronization task to synchronize data?
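As an illustration of the kind of fix these questions cover: the `net_write_timeout/net_read_timeout` error listed above can often be mitigated on the client side by raising the session timeout in the JDBC connection string. This is a sketch, not the authoritative answer from the linked FAQ entry; the host, database name, and the value `7200` (seconds) are placeholders you would adapt to your environment:

```
jdbc:mysql://<host>:3306/<database>?sessionVariables=net_write_timeout=7200
```

The `sessionVariables` property of MySQL Connector/J sets the variable for that connection only, which avoids changing the server-wide default.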