This topic describes the limits on the Wide Column model.
Tables
| Item | Limit | Description |
| --- | --- | --- |
| Name of a data table | Unique within an instance | The name must follow the naming conventions for data tables. |
| Number of reserved read capacity units (CUs) and reserved write CUs of a single table | 0 to 100,000 CUs | If this limit does not meet your business requirements, submit a ticket. |
| Number of predefined columns | 0 to 32 | Predefined columns are non-primary key columns whose names and types are defined when a data table is created. Predefined columns can be used as fields of a secondary index created for the data table. If this limit does not meet your business requirements, submit a ticket. |
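To make the table-level limits above concrete, the following sketch checks a table configuration against them on the client side. The limit values come from the table above; the helper function and its parameter names are illustrative and are not part of any Tablestore SDK.

```python
# Hypothetical client-side check of the table-level limits above.
# The limit values come from the documentation; the helper itself
# is illustrative and not a Tablestore SDK API.

MAX_RESERVED_CU = 100_000    # reserved read/write CUs per table: 0 to 100,000
MAX_PREDEFINED_COLUMNS = 32  # predefined columns per table: 0 to 32

def validate_table_config(reserved_read_cu, reserved_write_cu, predefined_columns):
    """Return a list of limit violations for a table configuration."""
    errors = []
    for kind, cu in (("read", reserved_read_cu), ("write", reserved_write_cu)):
        if not 0 <= cu <= MAX_RESERVED_CU:
            errors.append(f"reserved {kind} CUs must be 0 to {MAX_RESERVED_CU}, got {cu}")
    if len(predefined_columns) > MAX_PREDEFINED_COLUMNS:
        errors.append(
            f"at most {MAX_PREDEFINED_COLUMNS} predefined columns, "
            f"got {len(predefined_columns)}"
        )
    return errors

# Within all limits: no violations are reported.
violations = validate_table_config(0, 25_000, ["tags", "score"])
assert violations == []
```

A check like this can catch misconfigurations before a create-table or update-table request is sent; remember that limits above this range can only be raised by submitting a ticket.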
Columns
| Item | Limit | Description |
| --- | --- | --- |
| Column name | Unique within a table | The name must follow the naming conventions for columns in a data table. |
| Number of primary key columns | 1 to 4 | The first primary key column is the partition key. |
| Data types of primary key columns | String, Integer, and Binary | None |
| Number of attribute columns | None | Tablestore does not impose a limit on the number of attribute columns. However, if you attempt to read a row that contains an excessive number of attribute columns, such as hundreds of thousands, the read may fail due to a timeout. In this case, specify only the attribute columns that you want to read or read the data by page. We recommend that a row contain no more than 10,000 attribute columns. |
| Data types of attribute columns | String, Integer, Double, Boolean, and Binary | None |
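The primary-key rules above (1 to 4 columns, unique names, types restricted to String, Integer, and Binary) can be sketched as a client-side schema check. The function and the plain string type names are illustrative assumptions, not Tablestore SDK types.

```python
# Hypothetical check of a primary-key schema against the column limits
# above; the function and type names are illustrative, not an SDK API.

PK_TYPES = {"String", "Integer", "Binary"}      # allowed primary key types
ATTR_TYPES = PK_TYPES | {"Double", "Boolean"}   # attribute columns also allow these

def validate_primary_key(schema):
    """schema: ordered list of (column_name, type_name) pairs.

    The first column in the list is the partition key.
    Returns True if the schema satisfies the primary-key limits.
    """
    if not 1 <= len(schema) <= 4:               # 1 to 4 primary key columns
        return False
    names = [name for name, _ in schema]
    if len(set(names)) != len(names):           # column names must be unique
        return False
    return all(t in PK_TYPES for _, t in schema)

assert validate_primary_key([("user_id", "String"), ("ts", "Integer")])
assert not validate_primary_key([])                   # at least one PK column
assert not validate_primary_key([("pk", "Double")])   # Double not allowed in PKs
```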
Rows
| Item | Limit | Description |
| --- | --- | --- |
| Number of attribute columns in a row | None | None |
| Size of data in a row | None | Tablestore does not impose a limit on the total size of column names and column values in a row. |
Operations
| Item | Limit | Description |
| --- | --- | --- |
| Size of data written by using a PutRow request | 4 MB | None |
| Size of data updated by using an UpdateRow request | 4 MB | None |
| Number of rows written by using a BatchWriteRow request | 200 | None |
| Size of data written by using a BatchWriteRow request | 4 MB | None |
| Number of rows read by using a BatchGetRow request | 100 | None |
| Size of data scanned by using a GetRange request | 5,000 rows or 4 MB | A single GetRange request returns at most 5,000 rows or 4 MB of data. If either limit is exceeded, the result is truncated at a row boundary and the primary key of the next unread row is returned so that you can continue the scan from that row. |
| Maximum number of attribute columns written by using a request | 1,024 | When you call the PutRow, UpdateRow, or BatchWriteRow operation, you can write up to 1,024 attribute columns to a single row. |
| Number of columns specified by using the columns_to_get parameter in a read request | 0 to 128 | When you read a single row of data, up to 128 columns of the row can be returned. |
| Data size of an HTTP request body | 5 MB | None |
| Number of filters in a read request | 10 | None |
| Queries per second (QPS) on the metadata of tables | 10 | The QPS on the metadata of tables in an instance cannot exceed 10. For more information about the operations on table metadata, see Operations on data tables. |
| Number of UpdateTable operations on a single table | None | The number of UpdateTable operations on a single table is not limited, but the frequency at which you can call the operation is limited. |
| Frequency of calling the UpdateTable operation on a single table | Once every 2 minutes | The reserved read or write throughput of a single table can be adjusted at most once every 2 minutes. |
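When writing large data sets, the BatchWriteRow limits above (at most 200 rows and 4 MB of data per request) mean the client must split its rows into compliant batches. The following is a minimal sketch of that batching logic; the row objects and the size-accounting callback are simplified placeholders, not real SDK types, and each row is assumed to fit within the 4 MB limit on its own.

```python
# Sketch of splitting a write set into BatchWriteRow-sized batches.
# A single BatchWriteRow request allows at most 200 rows and 4 MB of
# data (limits from the table above). Row objects and the size_of
# callback are simplified placeholders, not real SDK types.

MAX_BATCH_ROWS = 200
MAX_BATCH_BYTES = 4 * 1024 * 1024  # 4 MB

def batch_rows(rows, size_of):
    """Yield lists of rows, each within the BatchWriteRow limits.

    rows: iterable of row objects.
    size_of: function returning a row's encoded size in bytes.
    Assumes every individual row fits within the 4 MB limit.
    """
    batch, batch_bytes = [], 0
    for row in rows:
        row_bytes = size_of(row)
        # Start a new batch if adding this row would exceed either limit.
        if batch and (len(batch) == MAX_BATCH_ROWS
                      or batch_bytes + row_bytes > MAX_BATCH_BYTES):
            yield batch
            batch, batch_bytes = [], 0
        batch.append(row)
        batch_bytes += row_bytes
    if batch:
        yield batch

# 450 rows of 1 byte each split into batches of 200, 200, and 50 rows.
sizes = [len(b) for b in batch_rows(range(450), lambda r: 1)]
assert sizes == [200, 200, 50]
```

Each yielded batch would then be sent in one BatchWriteRow call. Note that the HTTP request body itself is capped at 5 MB, so the 4 MB data limit leaves headroom for request encoding overhead.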