Creates a metadata table in Data Lake Formation (DLF).

Debugging

OpenAPI Explorer automatically calculates the signature value. For your convenience, we recommend that you call this operation in OpenAPI Explorer. OpenAPI Explorer also dynamically generates sample code for the operation in different SDKs.

Request headers

This operation uses only common request headers. For more information, see Common request parameters.

Request syntax

POST /api/metastore/catalogs/databases/tables HTTP/1.1

Request parameters

Parameter | Type | Position | Required | Example | Description
(Request body) | Object | Body | No | - | The HTTP request body, in the JSON format.
CatalogId | String | Body | No | 1344371 | The catalog ID of the metadatabase. Default value: the user ID of your Alibaba Cloud account.
DatabaseName | String | Body | No | database_test | The name of the metadatabase.
TableInput | TableInput | Body | No | - | The details about the metadata table.
RegionId | String | Host | No | cn-hangzhou | The ID of the region where DLF is activated.
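
The following minimal sketch, in Python, shows how these parameters map onto an HTTP call; it is not an official client. The endpoint host dlf.cn-hangzhou.aliyuncs.com is assumed from the RegionId, and Alibaba Cloud request signing is omitted, so in practice you would send the call through an SDK or OpenAPI Explorer, which add the required signature headers.

# Minimal sketch of the request body. Signing is omitted; an SDK or
# OpenAPI Explorer must add the Alibaba Cloud signature headers.
import json
import requests

# Assumed endpoint pattern: RegionId is carried in the host, not in the body.
endpoint = "https://dlf.cn-hangzhou.aliyuncs.com"

body = {
    "CatalogId": "1344371",           # omit to default to your Alibaba Cloud account ID
    "DatabaseName": "database_test",  # target metadatabase
    "TableInput": {                   # table definition; see the full sample request below
        "TableName": "test_table_20201223",
        "DatabaseName": "database_test",
        "TableType": "MANAGED_TABLE",
        "Sd": {
            "Cols": [{"Name": "name", "Type": "string", "Comment": "user_name"}]
        }
    }
}

response = requests.post(
    endpoint + "/api/metastore/catalogs/databases/tables",
    headers={"Content-Type": "application/json"},  # plus signature headers in a real call
    data=json.dumps(body),
    timeout=10,
)
print(response.status_code, response.text)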

Response parameters

Parameter | Type | Example | Description
Code | String | OK | The status code that is returned.
Message | String | . | The error message that is returned.
RequestId | String | B7F4B621-E41E-4C84-B97F-42B5380A32BB | The ID of the request.
Success | Boolean | true | Indicates whether the call was successful.

Error codes

InvalidObject: The specified name, partition, column, or skewed field failed the verification.

AlreadyExists: The specified metadata table already exists.

NoSuchObject: The specified metadatabase does not exist.

InternalError: An internal error has occurred. Troubleshoot the error based on the error message that is returned.
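
The sketch below shows one way a caller might branch on the response fields and error codes above; check_create_table_response is a hypothetical helper, not part of the DLF API.

# Hypothetical helper that inspects the documented response fields.
def check_create_table_response(payload: dict) -> None:
    if payload.get("Success"):
        print("Table created. Request ID:", payload.get("RequestId"))
        return
    code = payload.get("Code")
    message = payload.get("Message", "")
    if code == "AlreadyExists":
        print("The metadata table already exists; skip creation or choose a new name.")
    elif code == "NoSuchObject":
        print("The metadatabase does not exist; create it first.")
    elif code == "InvalidObject":
        print("A name, partition, column, or skewed field failed verification:", message)
    else:  # for example, InternalError
        print("Call failed:", code, message, "Request ID:", payload.get("RequestId"))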

Examples

Sample requests

POST /api/metastore/catalogs/databases/tables HTTP/1.1 
{
  "CatalogId": "1344371",
  "DatabaseName": "database_test",
  "TableInput": {
    "Cascade": false,
    "TableName": "test_table_20201223",
    "DatabaseName": "database_test",  
    "TableType": "MANAGED_TABLE",
    "Description": "",
    "Retention": 365,
    "Sd": {
      "Cols": [
        {
          "Comment": "user_name",
          "Name": "name",
          "Type": "string",
          "Parameters": {}
        }
      ],
      "Compressed": false,
      "InputFormat": "input",
      "Location": "",
      "NumBuckets": 5,
      "OutputFormat": "output",
      "Parameters": {},
      "SerDeInfo": {
        "Name": "",
        "SerializationLib": "",
        "Parameters": {}
      },
      "SkewedInfo": {
        "SkewedColNames": [],
        "SkewedColValues": [],
        "SkewedColValueLocationMaps": {}
      },
      "BucketCols": [
        "col1"
      ],
      "SortCols": [
        {
          "Order": 0,
          "Col": "col"
        }
      ],
      "StoredAsSubDirectories": false
    },
    "PartitionKeys": [
      {
        "Comment": "comment_day",
        "Name": "day",
        "Type": "int",
        "Parameters": {}
      }
    ],
    "Parameters": {},
    "ViewOriginalText": "",
    "ViewExpandedText": "",
    "RewriteEnabled": true,
    "Temporary": false
  }
}

Sample success responses

JSON format

{
  "Success": true,
  "Code": "OK",
  "Message": "",
  "HttpStatusCode": 200,
  "RequestId": "B7F4B621-E41E-4C84-B97F-42B5380A32BB"
}

Note: The following content describes the data formats that are supported by a metadata table.

When you create a metadata table, you must specify its data format by setting table.Parameters and the InputFormat, OutputFormat, and SerDeInfo fields of table.Sd. The following snippets show the required values for each format; a consolidated helper sketch follows the list.

Avro format

table.Parameters: {"classification":"avro"}

table.Sd:

"InputFormat":"org.apache.hadoop.hive.ql.io.avro.AvroContainerInputFormat"

"OutputFormat":"org.apache.hadoop.hive.ql.io.avro.AvroContainerOutputFormat"

"SerDeInfo":{"SerializationLib":"org.apache.hadoop.hive.serde2.avro.AvroSerDe","Parameters":{"serialization.format":"1"}}

JSON format

table.Parameters: {"classification":"json"}

table.Sd:

"InputFormat":"org.apache.hadoop.mapred.TextInputFormat"

"OutputFormat":"org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat"

"SerDeInfo":{"Parameters":{"paths":","},"SerializationLib":"org.apache.hive.hcatalog.data.JsonSerDe"}

XML format

table.Parameters: {"classification":"xml"}

table.Sd:

"InputFormat":"com.ibm.spss.hive.serde2.xml.XmlInputFormat"

"OutputFormat":"org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat"

"SerDeInfo":{"Parameters":{"rowTag":""},"SerializationLib":"com.ibm.spss.hive.serde2.xml.XmlSerDe"}

Parquet format

table.Parameters: {"classification":"parquet"}

table.Sd:

"InputFormat":"org.apache.hadoop.hive.ql.io.parquet.MapredParquetInputFormat"

"OutputFormat":"org.apache.hadoop.hive.ql.io.parquet.MapredParquetOutputFormat"

"SerDeInfo":{"Parameters":{"serialization.format":"1"},"SerializationLib":"org.apache.hadoop.hive.ql.io.parquet.serde.ParquetHiveSerDe"}

CSV format

table.Parameters: {"classification":"csv"}

table.Sd:

"InputFormat":"org.apache.hadoop.mapred.TextInputFormat"

"OutputFormat":"org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat"

"SerDeInfo":{"Parameters":{"separatorChar":","},"SerializationLib":"org.apache.hadoop.hive.serde2.OpenCSVSerde"}

Note: Set the separatorChar parameter to the field delimiter of the CSV data, such as a comma (,).

ORC format

table.Parameters: {"classification":"orc"}

table.Sd:

"InputFormat":"org.apache.hadoop.hive.ql.io.orc.OrcInputFormat"

"OutputFormat":"org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat"

"SerDeInfo":{"Parameters":{},"SerializationLib":"org.apache.hadoop.hive.ql.io.orc.OrcSerde"}