
Object Storage Service:Use Hadoop Shell commands to access OSS-HDFS

Last Updated:Dec 21, 2023

If you want to use the CLI to upload, download, or delete objects in a bucket for which OSS-HDFS is enabled, you can use Hadoop Shell commands.

Environment preparation

  • In the E-MapReduce (EMR) environment, JindoSDK is installed by default and can be used directly.
    Note To access OSS-HDFS, create a cluster of EMR 3.44.0 or later, or EMR 5.10.0 or later.
  • In a non-EMR environment, install JindoSDK first. For more information, see Deploy JindoSDK in an environment other than EMR.
    Note To access OSS-HDFS, deploy JindoSDK 4.6.x or later.

Commands and examples

The following section provides examples on how to use Hadoop Shell commands to access OSS-HDFS.
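All of the commands below address the bucket through the same oss:// root URI. As a convenience, you can build that URI once in a shell variable and reuse it in every command. A minimal sketch, assuming a hypothetical bucket named examplebucket in the cn-shanghai region:

```shell
# Hypothetical values; substitute your own bucket name and region.
BUCKET=examplebucket
REGION=cn-shanghai
# OSS-HDFS buckets are addressed through the .oss-dls.aliyuncs.com endpoint.
ROOT="oss://${BUCKET}.${REGION}.oss-dls.aliyuncs.com"
echo "${ROOT}/dir/"
# prints oss://examplebucket.cn-shanghai.oss-dls.aliyuncs.com/dir/
```

With this in place, a command such as `hdfs dfs -mkdir "${ROOT}/dir/"` is equivalent to spelling out the full URI shown in the examples below.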

  • Upload objects

    Run the following command to upload a file named examplefile.txt from the local root directory to a bucket named examplebucket:

    hdfs dfs -put examplefile.txt oss://examplebucket.cn-shanghai.oss-dls.aliyuncs.com/
  • Create directories

    Run the following command to create a directory named dir/ in a bucket named examplebucket:

    hdfs dfs -mkdir oss://examplebucket.cn-shanghai.oss-dls.aliyuncs.com/dir/
  • Query objects or directories

    Run the following command to query the objects or directories in a bucket named examplebucket:

    hdfs dfs -ls oss://examplebucket.cn-shanghai.oss-dls.aliyuncs.com/
  • Query the sizes of objects or directories

    Run the following command to query the sizes of all objects or directories in a bucket named examplebucket:

    hdfs dfs -du oss://examplebucket.cn-shanghai.oss-dls.aliyuncs.com/
  • Query the object content

    Run the following command to query the content of an object named localfile.txt in a bucket named examplebucket:

    hdfs dfs -cat oss://examplebucket.cn-shanghai.oss-dls.aliyuncs.com/localfile.txt
    Important The content of the queried object is displayed on the screen in plain text. If the content is encoded, use the HDFS API for Java to read and decode the content.
  • Copy objects or directories

    Run the following command to copy a directory named subdir1 from the root directory of a bucket named examplebucket to a directory named subdir2 in the same bucket. The source directory subdir1, the objects it contains, and the structure and content of its subdirectories remain unchanged.

    hdfs dfs -cp oss://examplebucket.cn-shanghai.oss-dls.aliyuncs.com/subdir1  oss://examplebucket.cn-shanghai.oss-dls.aliyuncs.com/subdir2/subdir1
  • Move objects or directories

    Run the following command to move a directory named srcdir in the root directory of a bucket named examplebucket, together with all objects and subdirectories it contains, to another root-level directory named destdir:

    hdfs dfs -mv oss://examplebucket.cn-shanghai.oss-dls.aliyuncs.com/srcdir  oss://examplebucket.cn-shanghai.oss-dls.aliyuncs.com/destdir
  • Download objects

    Run the following command to download an object named exampleobject.txt from a bucket named examplebucket to the /tmp directory on your local machine:

    hdfs dfs -get oss://examplebucket.cn-shanghai.oss-dls.aliyuncs.com/exampleobject.txt  /tmp/
  • Delete directories or objects

    Run the following command to delete a directory named destfolder/ and all objects in the directory from a bucket named examplebucket:

    hdfs dfs -rm -r oss://examplebucket.cn-shanghai.oss-dls.aliyuncs.com/destfolder/
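The sizes reported by the -du command above are printed one entry per line, with the size in bytes in the first column followed by the path. A small sketch that totals the first column with awk, using hypothetical sample output (in practice, pipe the real `hdfs dfs -du` output in):

```shell
# Two hypothetical -du output lines: size in bytes, then the path.
printf '%s\n' \
  '1024  oss://examplebucket.cn-shanghai.oss-dls.aliyuncs.com/a.txt' \
  '2048  oss://examplebucket.cn-shanghai.oss-dls.aliyuncs.com/dir' |
awk '{ total += $1 } END { print total }'
# prints 3072
```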