Creates a metadata table in Data Lake Formation (DLF).

Debugging

OpenAPI Explorer automatically calculates the signature value. For your convenience, we recommend that you call this operation in OpenAPI Explorer. OpenAPI Explorer also dynamically generates sample code for the operation in different SDKs.

Request headers

This operation uses only common request headers. For more information, see Common request parameters.

Request syntax

POST /api/metastore/catalogs/databases/tables 

Request parameters

Request body (Object, Body, Required: No).
The HTTP request body, in the JSON format.

CatalogId (String, Body, Required: No). Example: 1344371.
The catalog ID of the metadatabase. Default value: the user ID of your Alibaba Cloud account.

DatabaseName (String, Body, Required: No). Example: database_test.
The name of the metadatabase.

TableInput (TableInput, Body, Required: No).
The details about the metadata table.

RegionId (String, Host, Required: No). Example: cn-hangzhou.
The ID of the region where DLF is activated.
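
The Body parameters above together form one JSON object, while RegionId is carried in the request host rather than in the body. The following Python sketch assembles a minimal body from the example values in this table; the endpoint pattern is an assumption and should be checked against the DLF endpoint list for your region.

# Sketch only: assembles the CreateTable request body from the parameters above.
# The endpoint pattern is an assumption; verify it in the DLF endpoint documentation.
region_id = "cn-hangzhou"
endpoint = f"dlf.{region_id}.aliyuncs.com"  # RegionId is part of the host, not the body

request_body = {
    "CatalogId": "1344371",           # optional; defaults to the user ID of your Alibaba Cloud account
    "DatabaseName": "database_test",  # the metadatabase that will contain the table
    "TableInput": {                   # see the TableInput data type for the full field list
        "TableName": "test_table_20201223",
        "DatabaseName": "database_test",
        "TableType": "MANAGED_TABLE",
    },
}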

Response parameters

Code (String). Example: OK.
The status code that is returned.

Message (String). Example: .
The error message that is returned.

RequestId (String). Example: B7F4B621-E41E-4C84-B97F-42B5380A32BB.
The ID of the request.

Success (Boolean). Example: true.
Indicates whether the request was successful.
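
A call is considered successful when Success is true and Code is OK; keep RequestId for troubleshooting. A minimal check of the parsed response body, as a sketch:

def check_response(resp: dict) -> None:
    """Raise if the parsed CreateTable response indicates a failure (sketch)."""
    if not resp.get("Success") or resp.get("Code") != "OK":
        raise RuntimeError(
            "CreateTable failed: Code=%s, Message=%s, RequestId=%s"
            % (resp.get("Code"), resp.get("Message"), resp.get("RequestId"))
        )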

Error codes

InvalidObject: The specified name, partition, column, or skewed field failed the verification.

AlreadyExists: The specified metadata table already exists.

NoSuchObject: The specified metadatabase does not exist.

InternalError: An internal error has occurred. Troubleshoot the error based on the error message that is returned.
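
How you react to these codes depends on your workflow; the branching below is only a sketch of one reasonable policy, not part of the API contract.

def handle_error(code: str) -> str:
    """Map the documented error codes to a follow-up action (sketch; the policy is an assumption)."""
    if code == "AlreadyExists":
        return "skip"                   # the table already exists; often safe to treat as done
    if code == "NoSuchObject":
        return "create_database_first"  # the metadatabase must exist before the table is created
    if code == "InvalidObject":
        return "fix_table_input"        # a name, partition, column, or skewed field failed verification
    if code == "InternalError":
        return "retry"                  # server-side error; retry based on the returned message
    return "unknown"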

Examples

Sample requests

POST /api/metastore/catalogs/databases/tables HTTP/1.1 
{
  "CatalogId": "1344371",
  "DatabaseName": "database_test",
  "TableInput": {
    "Cascade": false,
    "TableName": "test_table_20201223",
    "DatabaseName": "database_test",  
    "TableType": "MANAGED_TABLE",
    "Description": "",
    "Retention": 365,
    "Sd": {
      "Cols": [
        {
          "Comment": "user_name",
          "Name": "name",
          "Type": "string",
          "Parameters": {}
        }
      ],
      "Compressed": false,
      "InputFormat": "input",
      "Location": "",
      "NumBuckets": 5,
      "OutputFormat": "output",
      "Parameters": {},
      "SerDeInfo": {
        "Name": "",
        "SerializationLib": "",
        "Parameters": {}
      },
      "SkewedInfo": {
        "SkewedColNames": [],
        "SkewedColValues": [],
        "SkewedColValueLocationMaps": {}
      },
      "BucketCols": [
        "col1"
      ],
      "SortCols": [
        {
          "Order": 0,
          "Col": "col"
        }
      ],
      "StoredAsSubDirectories": false
    },
    "PartitionKeys": [
      {
        "Comment": "comment_day",
        "Name": "day",
        "Type": "int",
        "Parameters": {}
      }
    ],
    "Parameters": {},
    "ViewOriginalText": "",
    "ViewExpandedText": "",
    "RewriteEnabled": true,
    "Temporary": false
  }
}
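
The sample above shows only the request line and body. A real call must also carry the common request headers and an Alibaba Cloud signature, which the SDKs and OpenAPI Explorer produce for you. The following Python sketch only illustrates where the body goes; the endpoint host is an assumption, and signed_headers stands in for headers generated by your SDK or signing code.

import json
import requests  # third-party HTTP client, used here only for illustration

def create_table(body: dict, signed_headers: dict,
                 endpoint: str = "dlf.cn-hangzhou.aliyuncs.com") -> dict:
    """POST the CreateTable body and return the parsed JSON response (sketch)."""
    url = "https://" + endpoint + "/api/metastore/catalogs/databases/tables"
    response = requests.post(url, headers=signed_headers, data=json.dumps(body))
    response.raise_for_status()
    return response.json()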

Sample success responses

JSON format

{
  "Success": true,
  "Code": "OK",
  "Message": "",
  "HttpStatusCode": 200,
  "RequestId": "B7F4B621-E41E-4C84-B97F-42B5380A32BB"
}

Error codes

For a list of error codes, visit the API Error Center.

Note: The following content describes the data formats that are supported by a metadata table.

When you create a metadata table, you must specify the data format. The following code shows you how to specify the data format:

Avro format

table.Parameters: {"classification":"avro"}

table.Sd:
"InputFormat": "org.apache.hadoop.hive.ql.io.avro.AvroContainerInputFormat"
"OutputFormat": "org.apache.hadoop.hive.ql.io.avro.AvroContainerOutputFormat"
"SerDeInfo": {"SerializationLib":"org.apache.hadoop.hive.serde2.avro.AvroSerDe","Parameters":{"serialization.format":"1"}}

JSON format

table.Parameters: {"classification":"json"}

table.Sd:
"InputFormat": "org.apache.hadoop.mapred.TextInputFormat"
"OutputFormat": "org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat"
"SerDeInfo": {"Parameters":{"paths":","},"SerializationLib":"org.apache.hive.hcatalog.data.JsonSerDe"}

XML format

table.Parameters: {"classification":"xml"}

table.Sd:
"InputFormat": "com.ibm.spss.hive.serde2.xml.XmlInputFormat"
"OutputFormat": "org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat"
"SerDeInfo": {"Parameters":{"rowTag":""},"SerializationLib":"com.ibm.spss.hive.serde2.xml.XmlSerDe"}

Parquet format

table.Parameters: {"classification":"parquet"}

table.Sd:
"InputFormat": "org.apache.hadoop.hive.ql.io.parquet.MapredParquetInputFormat"
"OutputFormat": "org.apache.hadoop.hive.ql.io.parquet.MapredParquetOutputFormat"
"SerDeInfo": {"Parameters":{"serialization.format":"1"},"SerializationLib":"org.apache.hadoop.hive.ql.io.parquet.serde.ParquetHiveSerDe"}

CSV format

table.Parameters: {"classification":"csv"}

table.Sd:
"InputFormat": "org.apache.hadoop.mapred.TextInputFormat"
"OutputFormat": "org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat"
"SerDeInfo": {"Parameters":{"separatorChar":","},"SerializationLib":"org.apache.hadoop.hive.serde2.OpenCSVSerde"}

Note: Set the separatorChar parameter to a single delimiter character, such as a comma (,).

ORC format

table.Parameters: {"classification":"orc"}

table.Sd:
"InputFormat": "org.apache.hadoop.hive.ql.io.orc.OrcInputFormat"
"OutputFormat": "org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat"
"SerDeInfo": {"Parameters":{},"SerializationLib":"org.apache.hadoop.hive.ql.io.orc.OrcSerde"}