
Simple Log Service:Kibana Dashboard migration

Last Updated: May 07, 2025

This topic describes how to migrate existing Kibana dashboards to the Kibana that is connected to Simple Log Service after you migrate your data.

Solution overview

  1. Export the dashboard's export.ndjson file from the source Kibana.

  2. Replace the index pattern ID in export.ndjson with the new Kibana index pattern ID.

  3. Import export.ndjson to the new Kibana.

Procedure

1. Prepare data and dashboards in Elasticsearch, and export the dashboard configuration

  1. Write the following data to Elasticsearch.

    POST people/_bulk
    { "index": {} }
    { "name": "Alice", "age": 30 }
    { "index": {} }
    { "name": "Bob", "age": 25 }
    { "index": {} }
    { "name": "Charlie", "age": 35 }
  2. Create two dashboards in Kibana and name them People Dashboard and People Names.


  3. Create a chart in People Dashboard.


  4. In Kibana, choose Stack Management > Saved Objects. On the Saved Objects page, select the dashboard that you want to export, and then click Export. Do not select Include related objects. The exported dashboard content is saved in export.ndjson.

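    If you prefer to script the export instead of using the Saved Objects UI, you can call the Kibana saved objects export API, as in the following Python sketch. The URL and credentials are placeholders, and setting includeReferencesDeep to False corresponds to leaving Include related objects unselected.

    import requests

    # Placeholder endpoint and credentials for the source Kibana.
    KIBANA_URL = "http://source-kibana:5601"
    AUTH = ("elastic", "your-password")

    resp = requests.post(
        f"{KIBANA_URL}/api/saved_objects/_export",
        json={"type": ["dashboard"], "includeReferencesDeep": False},
        headers={"kbn-xsrf": "true"},
        auth=AUTH,
    )
    resp.raise_for_status()

    # The API returns NDJSON, one saved object per line.
    with open("export.ndjson", "w") as f:
        f.write(resp.text)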

2. Reuse the dashboard in the Kibana connected to Simple Log Service

  1. Log on to the Simple Log Service console.

  2. In the Projects section, click the project that you want to manage.


  3. Write the same log data to the Logstore and make sure that the field names are consistent with those in Elasticsearch. For a programmatic way to write the data, see the sketch after the following note.

    Note

    The fields in Elasticsearch and Simple Log Service must match. Otherwise, migration errors occur, such as missing fields in the dashboard.

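    If you want to write the sample data programmatically instead of in the console, you can use the Simple Log Service SDK for Python (aliyun-log-python-sdk). The following is a minimal sketch; the endpoint, AccessKey pair, project, and Logstore names are placeholders.

    from aliyun.log import LogClient, LogItem, PutLogsRequest

    # Placeholder endpoint, AccessKey pair, project, and Logstore.
    client = LogClient("cn-hangzhou.log.aliyuncs.com",
                       "your-access-key-id", "your-access-key-secret")
    project = "your-project"
    logstore = "your-logstore"

    # The field names must match the fields that are used in Elasticsearch.
    # Field values in Simple Log Service are strings.
    people = [
        {"name": "Alice", "age": "30"},
        {"name": "Bob", "age": "25"},
        {"name": "Charlie", "age": "35"},
    ]

    log_items = []
    for person in people:
        item = LogItem()
        item.set_contents(list(person.items()))
        log_items.append(item)

    client.put_logs(PutLogsRequest(project, logstore, topic="",
                                   source="", logitems=log_items))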

  4. Deploy a new Kibana and connect it to Simple Log Service using Docker Compose or Helm. For more information, see Connect Simple Log Service to Kibana.

    Note

    The Elasticsearch and Kibana versions that you use here must match the previously used versions. You can verify the Kibana version by using the sketch at the end of this step.

    After the connection is established, the corresponding index pattern is automatically created in Kibana.

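    To confirm that the newly deployed Kibana version matches the source Kibana version, you can query the Kibana status API, as in the following sketch. The URL and credentials are placeholders, and the exact response format can vary slightly across Kibana versions.

    import requests

    # Placeholder endpoint and credentials for the new Kibana.
    resp = requests.get(
        "http://new-kibana:5601/api/status",
        auth=("elastic", "your-password"),
    )
    resp.raise_for_status()
    # The "version" object contains the Kibana version number.
    print(resp.json().get("version"))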

3. Perform the migration

  1. Check the index pattern ID in the source Kibana. The kibana_config_1.json file is as follows:

    {
        "url": "http://xxx:5601",
        "user": "elastic",
        "password": "",
        "space": "default"
    }

    Run the following command to view the index patterns by using ptn_list.py.

    ➜  python ptn_list.py kibana_config_1.json
    f06fc2b0-****-****-****-15adf26175c7    people

    In the output, f06fc2b0-****-****-****-15adf26175c7 is the ID of the source index pattern. You can find references to this ID in export.ndjson, the dashboard configuration file that you exported from Kibana.

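    If ptn_list.py is not available in your environment, a minimal equivalent can be written against the Kibana saved objects find API. The following Python sketch reads the same kibana_config_1.json format and prints each index pattern ID and title. It is an approximation of what ptn_list.py does, not the script itself, and it assumes the default Kibana space.

    import json
    import sys

    import requests

    # Usage: python ptn_list_sketch.py kibana_config_1.json
    with open(sys.argv[1]) as f:
        config = json.load(f)

    resp = requests.get(
        f"{config['url']}/api/saved_objects/_find",
        params={"type": "index-pattern", "per_page": 1000},
        auth=(config["user"], config["password"]),
    )
    resp.raise_for_status()

    # Print each index pattern ID and title, similar to the ptn_list.py output.
    for obj in resp.json()["saved_objects"]:
        print(obj["id"], obj["attributes"]["title"], sep="\t")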

  2. Find the new index pattern ID in the Kibana connected to Simple Log Service.

    Similarly, use ptn_list.py to view the index pattern ID in the new Kibana.

    # Prepare kibana_config_2.json
    
    ➜  python ptn_list.py kibana_config_2.json
    ef710470-****-****-****-ad198b7b763d	etl.people

    Run the following sed command to replace the old ID with the new ID in export.ndjson.

    sed -i 's/f06fc2b0-****-****-****-15adf26175c7/ef710470-****-****-****-ad198b7b763d/' export.ndjson
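
    The IDs in this example are masked with asterisks; use the full index pattern IDs that ptn_list.py prints in your environment. Note that on macOS, sed -i requires an empty suffix argument (sed -i ''). Alternatively, you can perform the replacement with a short Python snippet such as the following; the IDs are placeholders.

    from pathlib import Path

    # Placeholder IDs; use the full index pattern IDs from your environment.
    OLD_ID = "f06fc2b0-****-****-****-15adf26175c7"
    NEW_ID = "ef710470-****-****-****-ad198b7b763d"

    path = Path("export.ndjson")
    path.write_text(path.read_text().replace(OLD_ID, NEW_ID))
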
  3. In the new Kibana, choose Stack Management > Saved Objects. On the Saved Objects page, click Import to import export.ndjson, and then confirm that the import is successful.

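    You can also script the import against the saved objects import API of the new Kibana, as in the following sketch. The URL and credentials are placeholders.

    import requests

    # Placeholder endpoint and credentials for the Kibana connected to Simple Log Service.
    KIBANA_URL = "http://new-kibana:5601"
    AUTH = ("elastic", "your-password")

    with open("export.ndjson", "rb") as f:
        resp = requests.post(
            f"{KIBANA_URL}/api/saved_objects/_import",
            params={"overwrite": "true"},
            headers={"kbn-xsrf": "true"},
            files={"file": ("export.ndjson", f, "application/ndjson")},
            auth=AUTH,
        )
    resp.raise_for_status()
    print(resp.json().get("success"), resp.json().get("successCount"))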

  4. Open the new dashboard to view the import results.
