After you import the logs that are generated in Serverless App Engine (SAE) to Message Queue for Apache Kafka, you can write the logs from Message Queue for Apache Kafka to Alibaba Cloud Elasticsearch. This way, you can view and manage the logs in a unified manner. This topic describes how to import logs from SAE to Message Queue for Apache Kafka and then write the logs from Message Queue for Apache Kafka to Elasticsearch.

Step 1: Create a group

This section describes how to create the consumer group that is used to consume messages from Message Queue for Apache Kafka and write them to the Elasticsearch cluster.

  1. Log on to the Message Queue for Apache Kafka console.
  2. In the Resource Distribution section of the Overview page, select the region where your instance is deployed.
  3. On the Instances page, click the name of the instance that you want to manage.
  4. In the left-side navigation pane, click Groups.
  5. On the Groups page, click Create Group.
  6. In the Create Group panel, enter the group name in the Group ID field and the group description in the Description field, attach tags to the group, and then click OK.
    After the group is created, you can view the group on the Groups page.

Step 2: Create an index

This section describes how to create an index in Elasticsearch to receive the data from Message Queue for Apache Kafka.

  1. Log on to the Elasticsearch console.
  2. In the left-side navigation pane, click Elasticsearch Clusters.
  3. Navigate to the desired cluster.
    1. In the top navigation bar, select the resource group to which the cluster belongs and the region where the cluster resides.
    2. In the left-side navigation pane, click Elasticsearch Clusters. On the Elasticsearch Clusters page, find the cluster and click its ID.
  4. In the left-side navigation pane of the page that appears, choose Configuration and Management > Data Visualization.
  5. In the Kibana section, click Access over Internal Network or Access over Internet.
    • Access over Internal Network: The Access over Internal Network entry is displayed only after you enable the Private Network Access feature for the Kibana console. This feature is disabled by default. For more information, see Configure a public or private IP address whitelist for Kibana.
      Note You can enable Private Network Access only if port 5601 is enabled for access to Kibana over the Internet. If port 443 is enabled for access to Kibana over the Internet, you cannot enable Private Network Access. You can go to the console to check whether you can enable Private Network Access.
    • Access over Internet: The Access over Internet entry is displayed only after you enable the Public Network Access feature for the Kibana console. This feature is enabled by default. For more information, see Configure a public or private IP address whitelist for Kibana.
      Important If this is the first time you are logging on to the Kibana console from the Access over Internet entry, and you have not modified the access configuration, the system prompts you to modify the configuration. In the Note message, click Edit Configuration. The Modify Public Network Whitelist panel appears. In this panel, add the IP address of your client to the whitelist. For more information, see Configure a public or private IP address whitelist for Kibana. After you modify the configuration, go back to the Data Visualization page and click Access over Internet again. Then, the Kibana logon page appears.

      If your client is in a home network or in a LAN of an office, add the IP address of the Internet egress to the whitelist rather than the private IP address of the client. We recommend that you visit https://myip.ipip.net to query the IP address of the Internet egress. You can also add 0.0.0.0/0 to the whitelist. In this case, all public IPv4 addresses can be used to access the Kibana console, which poses security risks. Evaluate the risks before you use this configuration.

  6. On the Kibana logon page, enter your username and password and click Log in.
    The username is elastic by default. The password is the one that you specified when you created the Elasticsearch cluster.
  7. In the left-side navigation pane of the Kibana console, choose Management > Dev Tools.
  8. Run the following command to create an index:
    PUT /elastic_test
    {}
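The empty request body above creates the index with default settings, which is sufficient for this walkthrough. If you know the shape of your SAE log records, you can instead supply an explicit mapping in the request body. The following Python sketch only assembles and prints such a body; the field names timestamp, level, and message are hypothetical stand-ins for your own log fields:

```python
import json

# Hypothetical mapping for JSON-formatted SAE log records.
# Replace the field names with the actual fields in your logs.
index_body = {
    "mappings": {
        "properties": {
            "timestamp": {"type": "date"},
            "level": {"type": "keyword"},
            "message": {"type": "text"},
        }
    }
}

# Paste the printed JSON as the body of the PUT /elastic_test
# request in the Kibana Dev Tools console.
print(json.dumps(index_body, indent=2))
```

An explicit mapping prevents Elasticsearch from guessing field types when the first log record arrives.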

Step 3: Create a pipeline

This section describes how to create a Logstash pipeline. After the pipeline is deployed, data is continuously imported from Message Queue for Apache Kafka to Elasticsearch.

  1. Log on to the Elasticsearch console.
  2. Navigate to the desired cluster.
    1. In the top navigation bar, select the region where the cluster resides.
    2. In the left-side navigation pane, click Logstash Clusters. On the Logstash Clusters page, find the cluster and click its ID.
  3. In the left-side navigation pane of the page that appears, click Pipelines.
  4. On the Pipelines page, click Create Pipeline.
  5. In the Create Task wizard, configure the parameters.
    1. In the Config Settings step, configure the parameters and click Next.
      In this example, the parameters are configured to import data from Message Queue for Apache Kafka to Elasticsearch. For more information about the parameters, see Use configuration files to manage pipelines.
      • Pipeline ID: The custom ID of the pipeline.
      • Config Settings: The configurations of the pipeline. The following sample code provides an example on how to configure the parameters:
        input {
            kafka {
                bootstrap_servers => ["alikafka-pre-cn-zv**********-1-vpc.alikafka.aliyuncs.com:9092,alikafka-pre-cn-zv**********-2-vpc.alikafka.aliyuncs.com:9092,alikafka-pre-cn-zv**********-3-vpc.alikafka.aliyuncs.com:9092"]
                group_id => "elastic_group"
                topics => ["elastic_test"]
                codec => json
                consumer_threads => 12
                decorate_events => true
            }
        }
        output {
            elasticsearch {
                hosts => ["http://es-cn-o40xxxxxxxxxxxxwm.elasticsearch.aliyuncs.com:9200"]
                index => "elastic_test"
                password => "XXX"
                user => "elastic"
            }
        }

        The following describes the parameters in the preceding configuration.

        Parameters in the input section:
        • bootstrap_servers: The endpoint of the virtual private cloud (VPC) where the Message Queue for Apache Kafka instance resides. Example: alikafka-pre-cn-zv**********-1-vpc.alikafka.aliyuncs.com:9092,alikafka-pre-cn-zv**********-2-vpc.alikafka.aliyuncs.com:9092,alikafka-pre-cn-zv**********-3-vpc.alikafka.aliyuncs.com:9092
        • group_id: The name of the consumer group. Example: elastic_group
        • topics: The name of the topic. Example: elastic_test
        • codec: The decoding method. We recommend that you set this parameter to json, which is the format of the log data that is imported from SAE to Message Queue for Apache Kafka. Example: json
        • consumer_threads: The number of consumer threads. We recommend that you set this parameter to the number of partitions of the topic. Example: 12
        • decorate_events: Specifies whether to include message metadata. Default value: false. Example: true

        Parameters in the output section:
        • hosts: The endpoint of the Elasticsearch cluster. You can obtain the endpoint on the Basic Information page of the Elasticsearch cluster. Example: http://es-cn-o40xxxxxxxxxxxxwm.elasticsearch.aliyuncs.com:9200
        • index: The name of the index. Example: elastic_test
        • password: The password that is used to access the Elasticsearch cluster. The password is the one that you specified when you created the Elasticsearch cluster. Example: XXX
        • user: The username that is used to access the Elasticsearch cluster. The username is the one that you specified when you created the Elasticsearch cluster. Example: elastic
    2. In the Pipeline Parameters step, configure the parameters and click Save and Deploy.
      In this example, the default settings are used. You can modify the settings based on your business requirements.
  6. In the Note message, click OK.
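If you manage several pipelines, it can help to generate the Config Settings block from your endpoints instead of editing it by hand. The following Python sketch assembles the same input/output structure shown in the example above; render_pipeline_config is a local helper written for this example (not part of any Alibaba Cloud SDK), and all endpoint values are placeholders:

```python
def render_pipeline_config(bootstrap_servers, group_id, topic,
                           es_host, index, user, password,
                           consumer_threads=12):
    """Render a Logstash pipeline config like the example above."""
    servers = ",".join(bootstrap_servers)
    return f'''input {{
    kafka {{
        bootstrap_servers => ["{servers}"]
        group_id => "{group_id}"
        topics => ["{topic}"]
        codec => json
        consumer_threads => {consumer_threads}
        decorate_events => true
    }}
}}
output {{
    elasticsearch {{
        hosts => ["{es_host}"]
        index => "{index}"
        password => "{password}"
        user => "{user}"
    }}
}}'''

# Placeholder values; substitute your own endpoints and credentials.
config = render_pipeline_config(
    bootstrap_servers=["alikafka-xxx-1-vpc.alikafka.aliyuncs.com:9092"],
    group_id="elastic_group",
    topic="elastic_test",
    es_host="http://es-xxx.elasticsearch.aliyuncs.com:9200",
    index="elastic_test",
    user="elastic",
    password="XXX",
)
print(config)
```

Paste the printed text into the Config Settings field of the Create Task wizard.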

Step 4: Search for data

In the Kibana console, you can search for the data that is imported to Alibaba Cloud Elasticsearch by using pipelines.

  1. Log on to the Elasticsearch console.
  2. Navigate to the desired cluster.
    1. In the top navigation bar, select the resource group to which the cluster belongs and the region where the cluster resides.
    2. In the left-side navigation pane, click Elasticsearch Clusters. On the Elasticsearch Clusters page, find the cluster and click its ID.
  3. In the left-side navigation pane of the page that appears, choose Configuration and Management > Data Visualization.
  4. In the left-side navigation pane of the Kibana console, choose Management > Dev Tools.
    For information about how to log on to the Kibana console, see steps 5 to 7 in the "Create an index" section of this topic.
  5. Run the following command to search for data:
    GET /elastic_test/_search
    {}
    If the import is successful, the response contains the log data that is written to the elastic_test index.
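The empty body above returns a sample of all documents in the index. To filter the imported logs, you can supply a query DSL body instead. The following Python sketch only builds and prints such a body; the message and @timestamp field names are hypothetical stand-ins for the fields in your own JSON logs:

```python
import json

# Hypothetical query: full-text match on a "message" field of the
# imported JSON logs, newest first. Adjust field names to your logs.
search_body = {
    "query": {"match": {"message": "error"}},
    "sort": [{"@timestamp": {"order": "desc", "unmapped_type": "date"}}],
    "size": 20,
}

# Paste the printed JSON after GET /elastic_test/_search in Dev Tools.
print(json.dumps(search_body, indent=2))
```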