This topic describes how to access ApsaraDB RDS databases in a VPC over an elastic network interface (ENI) by using Serverless Spark.

Prerequisites

Serverless Spark must be able to access your VPC over an ENI. For information about how to allow Serverless Spark to access a VPC, see Access your VPC.

Configure a whitelist for the ApsaraDB RDS instance

Add the CIDR block of the VSwitch where the ENI resides to an IP address whitelist of the ApsaraDB RDS instance. Alternatively, add the security group to which the ENI belongs to the security group whitelist of the ApsaraDB RDS instance.
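
For example, the following is a minimal sketch assuming that you use the Alibaba Cloud CLI: the ModifySecurityIps operation of ApsaraDB RDS adds IP addresses or CIDR blocks to a whitelist group. The instance ID (rm-xxxxxxxx), whitelist group name (serverless_spark), and CIDR block (192.168.0.0/24) are placeholders. You can also complete this step in the ApsaraDB RDS console.
# A hedged sketch: append the VSwitch CIDR block to a whitelist group of the
# ApsaraDB RDS instance. Replace the placeholders with your own values.
aliyun rds ModifySecurityIps \
  --DBInstanceId rm-xxxxxxxx \
  --DBInstanceIPArrayName serverless_spark \
  --SecurityIps 192.168.0.0/24 \
  --ModifyMode Append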

Prepare test data in the ApsaraDB RDS database

Log on to the ApsaraDB RDS database and create the test data. For more information, see Use DMS to log on to an ApsaraDB RDS for MySQL instance. The following statements create the test data:
CREATE TABLE `persons` (
  `id` int(11) DEFAULT NULL,
  `first_name` varchar(32) DEFAULT NULL,
  `last_name` varchar(32) DEFAULT NULL,
  `age` int(11) DEFAULT NULL
) ENGINE=InnoDB DEFAULT CHARSET=utf8;

INSERT INTO persons VALUES(1,'a','b',5);
INSERT INTO persons VALUES(2,'c','d',6);
INSERT INTO persons VALUES(3,'e','f',7);

Write a Spark application to access the ApsaraDB RDS database

The Spark application reads the JDBC URL, table name, username, and password from its arguments, connects to the ApsaraDB RDS database, and displays the content of the table. The following code shows the sample application:
package com.aliyun.spark

import org.apache.spark.sql.SparkSession

object SparkRDS {

  def main(args: Array[String]): Unit = {
    val sparkSession = SparkSession.builder()
      .appName("rds test")
      .getOrCreate()

    // The connection information is passed in as job arguments:
    // the JDBC URL, the table name, the database username, and the password.
    val url = args(0)
    val dbtable = args(1)
    val user = args(2)
    val password = args(3)

    // Read the ApsaraDB RDS table over JDBC by using the MySQL driver.
    val jdbcDF = sparkSession.read
      .format("jdbc")
      .option("url", url)
      .option("driver", "com.mysql.jdbc.Driver")
      .option("dbtable", dbtable)
      .option("user", user)
      .option("password", password)
      .load()

    // Display the content of the table.
    jdbcDF.show()
  }

}
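
If you only need part of a table, the dbtable option also accepts a subquery, so the filtering is performed in the ApsaraDB RDS database instead of in Spark. The following snippet is a sketch that can be placed in the same main method as the sample above; it reuses the url, user, and password variables, and the subquery is an assumption based on the persons test table created earlier.
// A sketch: push the filter down to ApsaraDB RDS by passing a subquery as the
// dbtable option. Column names follow the persons test table created above.
val filteredDF = sparkSession.read
  .format("jdbc")
  .option("url", url)
  .option("driver", "com.mysql.jdbc.Driver")
  .option("dbtable", "(SELECT id, first_name, last_name FROM persons WHERE age > 5) AS t")
  .option("user", user)
  .option("password", password)
  .load()

filteredDF.show()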

Upload files to OSS

Compile and package the code, and upload both the JAR package of the Spark application and the MySQL driver dependency to OSS. For information about how to obtain the MySQL driver dependency, see Download address of the MySQL driver dependency.

For more information, see Upload an object.
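
The following is a minimal sketch of this step, assuming that you build the project with Maven and upload the files with the ossutil tool. The JAR file names and the OSS bucket changqing-dla-test match the sample job script in the next section; the local paths are placeholders that depend on your project layout.
# A hedged sketch: package the Spark application and upload the JAR files to OSS.
mvn clean package

# Upload the application JAR and the MySQL driver JAR to the OSS bucket.
ossutil cp target/rds_test.jar oss://changqing-dla-test/rds_test.jar
ossutil cp mysql-connector-java.jar oss://changqing-dla-test/mysql-connector-java.jar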

Submit a job

Write a spark-submit script in Serverless Spark. For more information, see Create and run Spark jobs. In the following script, the items in the args array map, in order, to the arguments of the main method in the sample application: the JDBC URL, the table name, the username, and the password.
{
    "args": [
        "jdbc:mysql://Enter the URL of your ApsaraDB RDS database",
        "persons",
        "spark",
        "Enter your password"
    ],
    "name": "changqing-dla-test",
    "jars": [
        "oss://changqing-dla-test/mysql-connector-java.jar"
    ],
    "file": "oss://changqing-dla-test/rds_test.jar",
    "className": "com.aliyun.spark.SparkRDS",
    "conf": {
        "spark.dla.eni.enable": "true",
        "spark.dla.eni.vswitch.id": "Enter the ID of the VSwitch you selected",
        "spark.dla.eni.security.group.id": "Enter the ID of the security group you selected",
        "spark.driver.resourceSpec": "medium",
        "spark.executor.instances": 1,
        "spark.executor.resourceSpec": "medium"
    }
}