IoT Platform: Configure the Source Node

Last Updated: Oct 31, 2024

This topic outlines the steps to configure the source node in the data parsing workspace, which serves as the data source for subsequent parsing tasks.

Prerequisites

Ensure that you have created a data parsing task. For more information, see Create a data parsing task.

Procedure

  1. Access the data parsing console.

  2. Within the workspace, click the default Source Node on the canvas.

  3. In the right-side configuration panel, set up the Basic Information according to the Data Source Type you're working with.

    • For the IoT Instance Topic type, configure the parameters in the following table to process data that devices report through custom or Thing Specification Language communication topics.

      | Parameter | Description | References |
      | --- | --- | --- |
      | Associated Instance | The drop-down list displays all instances under the current Alibaba Cloud account. | Overview |
      | Associated Product | The drop-down list displays all products under the selected instance. | Create a product |
      | Associated Device | Select a device under the chosen product. Currently, only the All Devices option is available. | Create a device |
      | Topic Type | Options: System Topic, Custom Topic, and Thing Specification Language communication topic. For the topic types that each device type supports, see the Topic Type Description Table below. | Topics |
      | Topic Name | The communication data of this topic serves as the data source of the parsing task. For custom topics of cloud gateway products, manually enter the topic name in the Topic Name field, for example, /${productKey}/${deviceName}/user/update. In all other cases, select the topic to be parsed from the Topic Name list. | Topic categories |

      Topic Type Description Table:

      | Device | System Topic | Custom Topic | Thing Specification Language Topic |
      | --- | --- | --- | --- |
      | Cloud Gateway Device: MQTT | Not supported. | Supported. For more details, see Add a custom topic category. | Supports property and event types. For more details, see Devices submit property data to IoT Platform. |
      | Cloud Gateway Device: NB-IoT | Not supported. | Supported. For more details, see Add a custom topic category. | Supports property and event types. For more details, see Devices submit property data to IoT Platform. |
      | Cloud Gateway Device: JT/T 808 | Supports the device report data topic $JT808/${manufacturer}/${deviceModel}/${deviceId}/up. For more details, see Device data submission. | Not supported. | Not supported. |
      | Cloud Gateway Device: GB/T 32960 | Supports the device report data topic $GB32960/${VIN}/up. For more details, see Device data submission. | Not supported. | Not supported. |
      | Non-cloud Gateway Device | Supports the device shadow publish topic /shadow/update/${YourProductKey}/${YourDeviceName}. For more details, see Forwarding of device shadow data. | Supported. For more details, see Add a custom topic category. | Supports property and event types. For more details, see Devices submit property data to IoT Platform. |

    • For the API Data Source type, select a specific API data source to process external data imported through the API.

      For more information, see Configure an API data source.

  4. Click Next, select the Topic Format, and configure format parsing.

    JSON, ProtoBuf, Base64(to_JSON)

    1. Based on the selected Topic format, configure the sample data.

      Note

      If data has been reported in the last 7 days, click Pull online data to automatically populate the sample data.

      | Topic format | Sample configuration |
      | --- | --- |
      | JSON | Enter sample data directly in the Sample Data field. The content cannot exceed 16 KB. A hypothetical sample follows this table. |
      | ProtoBuf | First, click Upload .desc file to upload the .desc file required for parsing ProtoBuf data (for the generation method, see Appendix: Generating .desc Files). Then, after selecting the message type, click Upload binary data file to upload the sample data. |
      | Base64(to_JSON) | Click Upload Base64 data file to upload the sample data. |
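      For reference, here is a minimal, hypothetical JSON sample of the kind a device might report. The field names and values are illustrative assumptions, not a required schema:

        {
          "deviceType": "sensor",
          "data": {
            "temperature": 23.5,
            "humidity": 60
          },
          "ts": 1729641600000
        }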

    2. Once the sample data is configured, click Validate parsing:

      • If parsing is successful, the data structure will be displayed in the Parsing Preview.

      • If parsing fails, revise the sample data as suggested and revalidate.

    3. Choose the Pass-through option.

      • No (default).

      • Yes: Selecting this option means only parsed data will pass through the Topic, and custom data storage and SQL offline analysis will not be supported.

    4. Click Save.

      The parsed fields' structure, including names and data types, will be listed under Data Structure below the canvas.

      Note

      If Pass-through is set to Yes, any nodes following the Source Node will be removed, and a Target Node will be automatically added and connected. In this case, no other nodes can be inserted between the Source Node and the Target Node.

    Raw data

    1. Select Pass-through.

      • No (default).

      • Yes: When selected, only parsed data will be transmitted through the Topic. Custom data storage and SQL offline analysis will not be supported.

    2. Click Save.

      A Custom node will be automatically created and linked to the Source Node on the canvas.

      No nodes can be inserted between the Source Node and the Custom node. Any nodes previously connected after the Source Node will be rerouted to the Custom node.

      Note

      If Pass-through is set to Yes, any nodes following the Source Node will be removed, and a Target Node will be automatically added and connected after the Custom node. In this case, no nodes can be placed between the Custom node and the Target Node.

    3. Set up the Custom node's script to process raw data.

      1. Click on the Custom node on the canvas.

      2. In the custom script panel, choose the scripting language and input your script under Edit script.

        | Supported Scripting Language | Function to Define | Example Code |
        | --- | --- | --- |
        | JavaScript (ECMAScript 5) | executeScript() | JavaScript script example |
        | Python 2.7 | execute_script() | Python script example |
        | PHP 7.2 | executeScript() | PHP script example |

        A minimal JavaScript sketch of such a script appears after this procedure.

      3. In the Analog Input tab, input the data reported by the simulated device.

      4. Click Execute.

        Upon successful execution, the processed data will be displayed in the Execution Result tab. To view the script execution log, click the Execution Log tab.
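    The following ECMAScript 5 sketch illustrates an executeScript() entry point of the kind referenced in the table above. The parameter shape, byte layout, and field names are assumptions made for illustration; this topic does not define the exact input and output contract:

      /*
       * Minimal sketch of an executeScript() entry point (ECMAScript 5).
       * Assumption: rawData is an array of unsigned bytes reported by the
       * device, and the function must return a JSON-serializable object.
       */
      function executeScript(rawData) {
        var result = {};
        // Hypothetical layout: byte 0 = message type,
        // bytes 1-2 = temperature x 10, big-endian.
        result.messageType = rawData[0];
        result.temperature = ((rawData[1] << 8) | rawData[2]) / 10.0;
        return result;
      }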

  5. To complete the source node configuration, click Save in the upper-right corner of the Data Analysis Workbench.

What to do next

Once the source node configuration is complete, proceed to set up additional processing nodes for data parsing or configure the target node to finalize the parsing task configuration.

Appendix: Generating .desc Files

Important
  • Prior to configuring sample data in ProtoBuf format, it is essential to upload the corresponding .desc file for data parsing.

  • A fundamental understanding of Protocol Buffers is required to utilize this feature.

  1. Begin by downloading and installing Protocol Buffers.

  2. Execute the command below to generate the .desc file:

    protoc -I=/filepath1/ --descriptor_set_out=/filepath2/proto.desc /filepath3/proto.proto

    The command parameters are described in the following table:

    | Parameter | Description |
    | --- | --- |
    | -I | Abbreviation of --proto_path. Specifies the directory to search for imported dependencies when compiling the .proto file. Replace /filepath1/ with the directory where the dependencies reside; if there are no dependencies, set it to any local directory. |
    | --descriptor_set_out | Specifies the output path of the generated .desc file. Replace /filepath2/proto.desc with the desired output path, including the file name. |
    | /filepath3/proto.proto | The path and name of the source .proto file. To generate a .desc file from multiple .proto files, enter multiple paths and file names separated by commas (,). Replace this with the actual path and name of your source file. |
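    As a concrete illustration, the following invocation uses hypothetical paths and file names; substitute your own:

      # telemetry.proto resides in /home/user/proto/ and has no imports,
      # so -I points at that same directory; the .desc file is written to /home/user/out/.
      protoc -I=/home/user/proto/ --descriptor_set_out=/home/user/out/telemetry.desc /home/user/proto/telemetry.proto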
