Redash is an open source business intelligence (BI) tool that supports various data sources, such as MySQL and PostgreSQL. Redash provides an intuitive web interface that you can use to explore and visualize data from different databases. This topic describes how to use Redash to connect to AnalyticDB for MySQL.
Prerequisites
Redash is installed. For more information, see the Redash official documentation.
The IP address of the server where Redash is running has been added to the whitelist of the AnalyticDB for MySQL cluster. For more information, see Whitelist.
To connect to a Spark JDBC endpoint, you must also create a Spark Interactive resource group.
Connect Redash to an AnalyticDB for MySQL endpoint
Run Redash. In the address bar of your browser, enter http://<IP address>:<Port number> to access the Redash web interface.
IP address: The IP address of the server where Redash is running.
Port number: The default port is 5000. If this port is already in use, you can change the port number when you run Redash.
In the navigation pane on the left, click Settings. On the Data Sources tab, click + New Data Source.

In the Create a New Data Source dialog box, configure the following parameters and click Create.

Type Selection: The type of the data source. Select MySQL from the drop-down list.
Configuration:
  Name: The name of the data source. You can enter a custom name.
  Host: The endpoint of the AnalyticDB for MySQL cluster.
    If Redash is installed on an Elastic Compute Service (ECS) instance that is in the same virtual private cloud (VPC) as the AnalyticDB for MySQL cluster, enter the internal endpoint of the cluster.
    If Redash is installed on a local server, enter the public endpoint of the cluster.
  Port: The value is fixed at 3306.
  User: The database account of the AnalyticDB for MySQL cluster.
  Password: The password of the database account of the AnalyticDB for MySQL cluster.
  Database Name: The name of the database in the AnalyticDB for MySQL cluster.
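Optionally, before you add the data source in Redash, you can verify that the endpoint, port, account, and whitelist settings work by connecting to the cluster directly. The following minimal sketch uses the third-party pymysql package; the endpoint, account, password, and database names are placeholder assumptions that you must replace with your own values.

# Optional sanity check: connect with the same values you plan to enter
# in the Redash data source form. All values below are placeholders.
import pymysql

connection = pymysql.connect(
    host="your_analyticdb_endpoint",  # AnalyticDB for MySQL endpoint (placeholder)
    port=3306,                        # the port is fixed at 3306
    user="your_db_account",           # database account (placeholder)
    password="your_password",         # account password (placeholder)
    database="your_database",         # target database (placeholder)
)
try:
    with connection.cursor() as cursor:
        cursor.execute("SELECT 1")
        print(cursor.fetchone())  # (1,) indicates that the connection works
finally:
    connection.close()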
In the navigation pane on the left, create a new query. In the query editor, enter an SQL statement and click Execute.

After the SQL statement is executed, click + Add Visualization to create a visualization chart of the query results. For more information, see the Redash official documentation.

Connect Redash to a Spark JDBC endpoint
Prepare the environment
(Optional) Copy the adb_spark.png file to the destination path.
If you deploy Redash from source code, the destination path is <root directory of the source code>/client/app/assets/images/db-logos/.
If you deploy Redash from a Docker image, the destination path is </app folder>/client/dist/images/db-logos/.
Note: </app folder> refers to the /app folder in the redash_server, redash_scheduler, redash_adhoc_worker, and redash_scheduled_worker containers.
Copy the adb_spark_ds.py file to the destination path.
If you deploy Redash from source code, the destination path is <root directory of the source code>/redash/query_runner/.
If you deploy Redash from a Docker image, the destination path is </app folder>/redash/query_runner/.
Note: </app folder> refers to the /app folder in the redash_server, redash_scheduler, redash_adhoc_worker, and redash_scheduled_worker containers.
Add the 'redash.query_runner.adb_spark_ds' configuration parameter to the __init__.py file.
Path of the __init__.py file:
If you deploy Redash from source code, the path is <root directory of the source code>/redash/settings/__init__.py.
If you deploy Redash from a Docker image, the path is </app folder>/redash/settings/__init__.py.
Note: </app folder> refers to the /app folder in the redash_server, redash_scheduler, redash_adhoc_worker, and redash_scheduled_worker containers.
The parameter uses the following format:
default_query_runners = [
    'redash.query_runner.athena',
    'redash.query_runner.big_query',
    ........
    'redash.query_runner.uptycs',
    'redash.query_runner.adb_spark_ds'
]
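For context, Redash imports each module listed in default_query_runners at startup, and each module is expected to register a query runner class. The following is a minimal, hypothetical sketch of that registration pattern based on Redash's BaseQueryRunner and register helpers. It is not the contents of the provided adb_spark_ds.py file; the class name and configuration fields shown here are illustrative assumptions.

# Hypothetical sketch of the query runner registration pattern.
# This is not the actual adb_spark_ds.py; it only illustrates why adding
# 'redash.query_runner.adb_spark_ds' to default_query_runners is required.
from redash.query_runner import BaseQueryRunner, register


class ADBSparkDS(BaseQueryRunner):  # illustrative class name
    @classmethod
    def type(cls):
        # Internal type identifier of the data source.
        return "adb_spark_ds"

    @classmethod
    def configuration_schema(cls):
        # Fields rendered in the data source configuration form (assumed set).
        return {
            "type": "object",
            "properties": {
                "host": {"type": "string"},
                "port": {"type": "number", "default": 10000},
                "database": {"type": "string"},
                "username": {"type": "string"},
                "password": {"type": "string"},
                "resource_group": {"type": "string"},
            },
            "required": ["host", "username", "password"],
            "secret": ["password"],
        }

    def run_query(self, query, user):
        # The real implementation submits the query to the Spark Interactive
        # resource group and returns (json_data, error).
        raise NotImplementedError()


# Importing the module registers the runner with Redash.
register(ADBSparkDS)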
Procedure
Access the Redash web interface. In the navigation pane on the left, click Settings. On the Data Sources tab, click + New Data Source.

In the Create a New Data Source dialog box, configure the following parameters and click Create.

Type Selection: The type of the data source. Select ADB Spark from the drop-down list.
Configuration:
  Name: The name of the data source. You can enter a custom name.
  Host: The endpoint of the Spark Interactive resource group. For information about how to obtain the endpoint, see Preparations.
  Port: The port of the Spark Interactive resource group. The value is fixed at 10000.
  Database: The database in the AnalyticDB for MySQL cluster.
  Username: The database account of the AnalyticDB for MySQL cluster.
  Password: The password of the database account of the AnalyticDB for MySQL cluster.
  Resource Group: The name of the Spark Interactive resource group.
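Optionally, you can check that the Spark Interactive resource group endpoint is reachable with these values before you configure Redash. The Spark JDBC endpoint on port 10000 is generally compatible with the HiveServer2 Thrift protocol, so the following minimal sketch uses the third-party PyHive package. The endpoint, account, database, and auth mode are placeholder assumptions that you must adapt to your environment, and this sketch does not pass the Resource Group parameter.

# Optional connectivity check against the Spark Interactive endpoint.
# All values below are placeholders; adapt them to your environment.
from pyhive import hive

conn = hive.Connection(
    host="your_spark_interactive_endpoint",  # Spark Interactive endpoint (placeholder)
    port=10000,                               # the port is fixed at 10000
    username="your_db_account",               # database account (placeholder)
    password="your_password",                 # account password (placeholder)
    database="your_database",                 # target database (placeholder)
    auth="LDAP",  # PyHive needs an auth mode such as LDAP when a password is set (assumption)
)
cursor = conn.cursor()
cursor.execute("SELECT 1")
print(cursor.fetchone())  # (1,) indicates that the connection works
cursor.close()
conn.close()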
In the navigation pane on the left, create a new query. In the query editor, enter an SQL statement and click Execute.

After the SQL statement is executed, click + Add Visualization to create a visualization chart of the query results. For more information, see the Redash official documentation.
