ksqlDB is a streaming SQL engine for Apache Kafka. It simplifies stream processing by providing an easy-to-use interface on which you can execute SQL statements. This allows you to process Apache Kafka data and run SQL queries on streaming data in real time. ksqlDB supports a wide range of stream processing operations, such as aggregations, joins, windowing, and sessionization.
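As an illustration of these capabilities, the following sketch shows a windowed aggregation. The stream name (payments) and the window size are illustrative assumptions, not part of this walkthrough:

```sql
-- Illustrative sketch: count payment events per id in 30-second tumbling windows.
-- The stream name (payments) and window size are assumptions for illustration.
CREATE TABLE payments_per_window AS
  SELECT id, COUNT(*) AS payment_count
  FROM payments
  WINDOW TUMBLING (SIZE 30 SECONDS)
  GROUP BY id
  EMIT CHANGES;
```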
Architectures
The following figures describe the architecture of traditional stream processing and the architecture of ksqlDB-based stream processing. In the latter architecture, the stream processing engine and connectors are integrated into ksqlDB. ksqlDB also provides materialized views for you to perform SQL queries during stream processing. For more information, see ksqlDB for Confluent Platform.
Architecture of traditional stream processing applications

Architecture of ksqlDB-based stream processing applications

Use ksqlDB
Create and configure a topic
Create a topic. In this example, a topic named ksql_test is created.
Create a schema. Select Avro as the validation mode and add the following validation rule:
{
  "namespace": "io.confluent.examples.clients.basicavro",
  "type": "record",
  "name": "Payment",
  "fields": [
    { "name": "id", "type": "string" },
    { "name": "amount", "type": "double" }
  ]
}
Enable schema validation for the ksql_test topic.
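To sanity-check a message against this schema before producing it, the following Python sketch compares a record against the Payment schema's fields. This helper is an assumption for illustration only; in practice, Avro validation is performed by the serializer and Schema Registry, not by hand-written code like this:

```python
import json

# The Payment schema from the step above.
PAYMENT_SCHEMA = json.loads("""
{
  "namespace": "io.confluent.examples.clients.basicavro",
  "type": "record",
  "name": "Payment",
  "fields": [
    {"name": "id", "type": "string"},
    {"name": "amount", "type": "double"}
  ]
}
""")

# Map a subset of Avro primitive types to Python types (illustrative only).
AVRO_TO_PY = {"string": str, "double": (int, float)}

def matches_schema(record: dict, schema: dict) -> bool:
    """Return True if every schema field is present with a compatible type."""
    for field in schema["fields"]:
        name, avro_type = field["name"], field["type"]
        if name not in record:
            return False
        if not isinstance(record[name], AVRO_TO_PY[avro_type]):
            return False
    return True

# The test message used later in this walkthrough.
print(matches_schema({"id": "Tome", "amount": 18}, PAYMENT_SCHEMA))  # True
```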
Authorization
ApsaraMQ for Confluent allows you to manage ksqlDB clusters by using role-based access control (RBAC). In this example, a user named test is created.
Create the test user and grant the following permissions to the user. For more information, see Manage users and grant permissions to them.

Username | Cluster type    | Resource type | Role
---------|-----------------|---------------|--------------
test     | Kafka cluster   | Cluster       | SystemAdmin
test     | KSQL            | Cluster       | ResourceOwner
test     | Schema Registry | Cluster       | SystemAdmin
Grant read-only permissions on the ksql_test topic to ksql, the default user of ksqlDB.

Username | Cluster type  | Resource type | Role
---------|---------------|---------------|--------------
ksql     | Kafka cluster | Topic         | DeveloperRead
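If you manage RBAC from a command line rather than the console, equivalent bindings can be sketched with the Confluent CLI. The cluster ID below is a placeholder, and the exact flags may vary by CLI version, so treat this as a sketch rather than a definitive procedure:

```shell
# Sketch only: <kafka-cluster-id> is a placeholder for your environment.
# Grant the test user SystemAdmin on the Kafka cluster.
confluent iam rbac role-binding create \
  --principal User:test \
  --role SystemAdmin \
  --kafka-cluster-id <kafka-cluster-id>

# Grant the default ksql user read-only access to the ksql_test topic.
confluent iam rbac role-binding create \
  --principal User:ksql \
  --role DeveloperRead \
  --resource Topic:ksql_test \
  --kafka-cluster-id <kafka-cluster-id>
```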
Procedure
Log on to the ApsaraMQ for Confluent console. In the left-side navigation pane, click Instances.
In the top navigation bar, select the region where the instance that you want to manage resides. On the Instances page, click the name of the instance.
In the upper-right corner of the Instance Details page, click Log on to Control Center to log on to Control Center.
On the Home page of Control Center, click the controlcenter.cluster card to go to the Cluster overview page.

In the left-side navigation pane, click ksqlDB. Then, click the name of the ksqlDB cluster that you want to manage.
On the cluster details page, click the Editor tab. You can create streams and use ksql commands to query data on the tab. For more information, see Quick Start.
Create a stream
CREATE STREAM ksql_test_stream WITH (KAFKA_TOPIC='ksql_test', VALUE_FORMAT='AVRO');

Query data from a stream

SELECT * FROM ksql_test_stream EMIT CHANGES;
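The SELECT statement above is a push query: it keeps running and emits changes continuously. To build one of the materialized views mentioned earlier, you could also define a table over the stream. The table name and aggregation below are illustrative assumptions, not part of this walkthrough:

```sql
-- Illustrative sketch: materialize the total amount per id.
CREATE TABLE payment_totals AS
  SELECT id, SUM(amount) AS total_amount
  FROM ksql_test_stream
  GROUP BY id
  EMIT CHANGES;

-- A pull query against the materialized view returns the current state
-- for one key and then terminates.
SELECT * FROM payment_totals WHERE id = 'Tome';
```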
Test and verify message sending
Enable stream data query.
On the ksqlDB page, click the Editor tab, enter the following query statement, and then click Run query.
SELECT * FROM ksql_test_stream EMIT CHANGES;
Send a test message.
Open a new Control Center window.
On the details page of the ksql_test topic, click the Messages tab, and then click Produce a new message.
In the Produce a new message panel, enter the following message content and click Produce.
{ "id": "Tome", "amount": 18 }
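If you prefer to send the test message from a shell instead of Control Center, Confluent Platform ships a kafka-avro-console-producer tool that can produce the same record. The broker and Schema Registry addresses below are placeholders for your environment, so this is a sketch rather than a copy-paste command:

```shell
# Sketch only: replace the placeholder addresses with your endpoints.
echo '{"id": "Tome", "amount": 18}' | kafka-avro-console-producer \
  --topic ksql_test \
  --bootstrap-server <broker-address>:9092 \
  --property schema.registry.url=http://<schema-registry-address>:8081 \
  --property value.schema='{"namespace":"io.confluent.examples.clients.basicavro","type":"record","name":"Payment","fields":[{"name":"id","type":"string"},{"name":"amount","type":"double"}]}'
```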
Verify message sending.
In the stream data query window that you opened earlier, verify that the test message that you sent appears in the query results.
