This topic describes how to quickly get started with DLA Ganos in Data Lake Analytics (DLA).

Procedure

  1. Prepare test data.
    Upload the required TIFF files to a specified Object Storage Service (OSS) directory. Example:
    oss://Bucket name/raster
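    If you prefer to upload the TIFF files programmatically instead of in the OSS console, the following minimal sketch shows one way to do this with the aliyun-sdk-oss client that this quickstart already depends on. The endpoint, AccessKey pair, bucket name, and file paths are placeholders that you must replace.
      import java.io.File
      import com.aliyun.oss.OSSClientBuilder
      
      object UploadTiff extends App {
        // Placeholders: replace with your own endpoint, AccessKey pair, and bucket name.
        val client = new OSSClientBuilder().build(
          "Endpoint", "Your AccessKey ID", "Your AccessKey secret")
        try {
          // Upload a local TIFF file to the raster directory of the bucket.
          client.putObject("Bucket name", "raster/sample.tif", new File("/path/to/sample.tif"))
        } finally {
          client.shutdown()
        }
      }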
  2. Load OSS data.
    1. Create the Maven project dla-ganos-quickstart and edit the pom.xml file of the project.
      <?xml version="1.0" encoding="UTF-8"?>
      <project xmlns="http://maven.apache.org/POM/4.0.0"
               xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
               xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
          <modelVersion>4.0.0</modelVersion>
      
          <groupId>com.aliyun.ganos.dla</groupId>
          <artifactId>dla-ganos-quickstart</artifactId>
          <version>1.0</version>
          <properties>
              <scala.version>2.11.12</scala.version>
              <scala.binary.version>2.11</scala.binary.version>
              <scala.xml.version>1.0.6</scala.xml.version>
              <scala.parsers.version>1.0.6</scala.parsers.version>
              <scalalogging.version>3.8.0</scalalogging.version>
              <spark.version>2.4.3</spark.version>
              <kryo.version>3.0.3</kryo.version>
          </properties>
          <dependencies>
              <dependency>
                  <groupId>com.aliyun.ganos</groupId>
                  <artifactId>dla-ganos-sdk</artifactId>
                  <version>1.0</version>
                  <scope>system</scope>
                  <systemPath>
                      The local directory where the downloaded dla-ganos-sdk-1.0.jar package is stored
                  </systemPath>
              </dependency>
              <dependency>
                  <groupId>io.spray</groupId>
                  <artifactId>spray-json_2.11</artifactId>
                  <version>1.3.5</version>
              </dependency>
              <dependency>
                  <groupId>org.apache.spark</groupId>
                  <artifactId>spark-core_${scala.binary.version}</artifactId>
                  <exclusions>
                      <exclusion>
                          <groupId>com.fasterxml.jackson.core</groupId>
                          <artifactId>jackson-databind</artifactId>
                      </exclusion>
                  </exclusions>
                  <version>${spark.version}</version>
                  <scope>provided</scope>
              </dependency>
              <dependency>
                  <groupId>org.apache.spark</groupId>
                  <artifactId>spark-sql_${scala.binary.version}</artifactId>
                  <version>${spark.version}</version>
                  <scope>provided</scope>
              </dependency>
              <dependency>
                  <groupId>org.apache.spark</groupId>
                  <artifactId>spark-catalyst_${scala.binary.version}</artifactId>
                  <version>${spark.version}</version>
                  <scope>provided</scope>
              </dependency>
              <!-- GeoTools -->
              <dependency>
                  <groupId>org.geotools</groupId>
                  <artifactId>gt-geojson</artifactId>
                  <version>23.0</version>
              </dependency>
              <dependency>
                  <groupId>org.geotools</groupId>
                  <artifactId>gt-metadata</artifactId>
                  <version>23.0</version>
              </dependency>
              <dependency>
                  <groupId>org.geotools</groupId>
                  <artifactId>gt-referencing</artifactId>
                  <version>23.0</version>
              </dependency>
              <dependency>
                  <groupId>org.geotools.jdbc</groupId>
                  <artifactId>gt-jdbc-postgis</artifactId>
                  <version>23.0</version>
              </dependency>
              <dependency>
                  <groupId>org.geotools</groupId>
                  <artifactId>gt-epsg-hsql</artifactId>
                  <version>23.0</version>
              </dependency>
              <dependency>
                  <groupId>com.aliyun.oss</groupId>
                  <artifactId>aliyun-sdk-oss</artifactId>
                  <version>3.9.0</version>
              </dependency>
          </dependencies>
      
          <build>
              <plugins>
                  <plugin>
                      <groupId>org.apache.maven.plugins</groupId>
                      <artifactId>maven-compiler-plugin</artifactId>
                      <configuration>
                          <source>1.8</source>
                          <target>1.8</target>
                      </configuration>
                  </plugin>
                  <plugin>
                      <groupId>net.alchim31.maven</groupId>
                      <artifactId>scala-maven-plugin</artifactId>
                      <executions>
                          <execution>
                              <id>scala-compile-first</id>
                              <phase>process-resources</phase>
                              <goals>
                                  <goal>add-source</goal>
                                  <goal>compile</goal>
                              </goals>
                          </execution>
                          <execution>
                              <id>scala-test-compile</id>
                              <phase>process-test-resources</phase>
                              <goals>
                                  <goal>testCompile</goal>
                              </goals>
                          </execution>
                      </executions>
                      <configuration>
                          <compilerPlugins>
                              <compilerPlugin>
                                  <groupId>org.spire-math</groupId>
                                  <artifactId>kind-projector_2.11</artifactId>
                                  <version>0.9.4</version>
                              </compilerPlugin>
                          </compilerPlugins>
                      </configuration>
                  </plugin>
              </plugins>
          </build>
          <repositories>
              <repository>
                  <id>osgeo</id>
                  <name>OSGeo Release Repository</name>
                  <url>https://repo.osgeo.org/repository/release/</url>
                  <snapshots><enabled>false</enabled></snapshots>
                  <releases><enabled>true</enabled></releases>
              </repository>
              <repository>
                  <id>osgeo-snapshot</id>
                  <name>OSGeo Snapshot Repository</name>
                  <url>https://repo.osgeo.org/repository/snapshot/</url>
                  <snapshots><enabled>true</enabled></snapshots>
                  <releases><enabled>false</enabled></releases>
              </repository>
          </repositories>
      </project>
    2. Create a Scala file named OSSTest.scala.
      import com.aliyun.ganos.dla._
      import com.aliyun.ganos.dla.raster._
      import com.aliyun.ganos.dla.oss._
      import com.aliyun.ganos.dla.geometry._
      import org.apache.log4j.{Level, Logger}
      import org.apache.spark.SparkConf
      import org.apache.spark.sql.SparkSession
      
      object OSSTest extends App {
      
        // Reduce log noise so that the query output is easy to read.
        Logger.getLogger("org").setLevel(Level.ERROR)
        Logger.getLogger("com").setLevel(Level.ERROR)
      
        // Create a SparkSession with Kryo serialization enabled.
        val spark: SparkSession = {
          val session = SparkSession.builder
            .withKryoSerialization
            .config(additionalConf)
            .getOrCreate()
          session
        }
      
        // Register the Ganos geometry and raster extensions with the session.
        spark.withGanosGeometry
        spark.withGanosRaster
      
        // The OSS directory that stores the uploaded TIFF files.
        val uri = new java.net.URI("oss://Bucket name/raster")
      
        // OSS access options. Replace the placeholders with your own values.
        val options = Map(
          "crs" -> "EPSG:4326",
          "endpoint" -> "Endpoint",
          "accessKeyId" -> "Your AccessKey ID",
          "accessKeySecret" -> "Your AccessKey secret")
      
        // Load the TIFF files in the OSS directory as a raster layer and
        // display the resulting tiles.
        val rf = spark.read.ganos.oss(options).loadLayer(uri)
        rf.show
      
        def additionalConf = new SparkConf(false)
      }
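      Because the layer is loaded as a Spark DataFrame, you can follow rf.show with standard DataFrame and SQL operations. A minimal sketch, added inside OSSTest after rf.show and assuming that loadLayer returns an ordinary Spark DataFrame (the view name raster_table is arbitrary):
        // Register the loaded tiles as a temporary view and query them with Spark SQL.
        rf.createOrReplaceTempView("raster_table")
        spark.sql("SELECT COUNT(*) FROM raster_table").show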
    3. Compile and package the project.
      mvn clean package
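      The packaged dla-ganos-quickstart-1.0.jar file is generated in the target directory. Before you submit the job, upload this file to OSS. For example, you can use the ossutil command-line tool; the destination path below is a placeholder:
      ossutil cp target/dla-ganos-quickstart-1.0.jar oss://Bucket name/path/to/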
  3. Submit a job.

    Log on to the Data Lake Analytics console to submit a Spark job. For more information, see Create and run Spark jobs.
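
    The Spark job in the DLA console is described by a JSON configuration. The following is a minimal sketch, assuming that dla-ganos-quickstart-1.0.jar and dla-ganos-sdk-1.0.jar have been uploaded to OSS; the OSS paths and resource specifications are placeholders that you must adjust for your environment.
      {
          "name": "dla-ganos-quickstart",
          "file": "oss://Bucket name/path/to/dla-ganos-quickstart-1.0.jar",
          "className": "OSSTest",
          "jars": "oss://Bucket name/path/to/dla-ganos-sdk-1.0.jar",
          "conf": {
              "spark.driver.resourceSpec": "medium",
              "spark.executor.resourceSpec": "medium",
              "spark.executor.instances": 2
          }
      }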

    View the job status. After the job is executed successfully, the original TIFF files are loaded into Spark as tiles, which you can then manage.