
Object Storage Service: Use Hadoop Shell commands to access OSS-HDFS

Last Updated: Mar 20, 2026

OSS-HDFS exposes a Hadoop Distributed File System (HDFS)-compatible interface, letting you use standard hdfs dfs commands to upload, download, list, copy, move, and delete objects in an OSS-HDFS-enabled bucket — no code changes required.

Prerequisites

Before you begin, ensure that you have one of the following environments set up:

  • Alibaba Cloud E-MapReduce (EMR) cluster: version 3.46.2 or later, or version 5.12.2 or later. EMR clusters that meet these version requirements come with OSS-HDFS integrated by default. See Create a cluster.

  • Standalone environment: JindoSDK 4.6.x or later, installed and deployed. See Deploy JindoSDK in an environment other than EMR.

Endpoint format

All OSS-HDFS paths follow this format:

oss://<bucket-name>.<region-id>.oss-dls.aliyuncs.com/<path>

For example, the root of examplebucket in the cn-hangzhou region is:

oss://examplebucket.cn-hangzhou.oss-dls.aliyuncs.com/

Note

The OSS-HDFS domain suffix is oss-dls.aliyuncs.com, not the standard OSS suffix oss.aliyuncs.com.
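In scripts, it can be convenient to assemble the OSS-HDFS path from its parts rather than hard-coding the full URI. A minimal sketch, where the bucket and region values are placeholders to replace with your own:

```shell
# Build an OSS-HDFS root path from its parts.
# BUCKET and REGION below are placeholder values.
BUCKET=examplebucket
REGION=cn-hangzhou
OSS_ROOT="oss://${BUCKET}.${REGION}.oss-dls.aliyuncs.com/"

# prints oss://examplebucket.cn-hangzhou.oss-dls.aliyuncs.com/
echo "${OSS_ROOT}"
```

Every command in the sections below can then reference "${OSS_ROOT}" instead of repeating the full endpoint.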

Commands

Each operation maps to a standard Hadoop Shell flag and is described in the section of the same name below:

  • Upload a file: -put
  • Create a directory: -mkdir
  • List objects and directories: -ls
  • Check sizes: -du
  • View object content: -cat
  • Copy an object or directory: -cp
  • Move an object or directory: -mv
  • Download an object: -get
  • Delete objects or directories: -rm -r

Upload a file

Upload a local file to the root of examplebucket:

hdfs dfs -put examplefile.txt oss://examplebucket.cn-hangzhou.oss-dls.aliyuncs.com/
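To upload several files, you can wrap -put in a loop. The sketch below is a dry run: it only prints the commands it would execute so you can review them first (the filenames are placeholders; remove the echo to actually run the uploads):

```shell
# Dry run: print one -put command per local file.
# Filenames and the destination bucket are placeholder values;
# drop the leading "echo" to perform the uploads for real.
DEST="oss://examplebucket.cn-hangzhou.oss-dls.aliyuncs.com/"
for f in examplefile1.txt examplefile2.txt; do
  echo hdfs dfs -put "$f" "$DEST"
done
```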

Create a directory

Create a directory named dir/ in examplebucket:

hdfs dfs -mkdir oss://examplebucket.cn-hangzhou.oss-dls.aliyuncs.com/dir/

Use -p to create parent directories if they do not exist:

hdfs dfs -mkdir -p oss://examplebucket.cn-hangzhou.oss-dls.aliyuncs.com/parent/child/dir/

List objects and directories

List all objects and directories in examplebucket:

hdfs dfs -ls oss://examplebucket.cn-hangzhou.oss-dls.aliyuncs.com/

Each line of the output contains the following fields, in order: permissions, number of replicas, owner, group, size, modification date, modification time, and path.

Use -R to list recursively:

hdfs dfs -ls -R oss://examplebucket.cn-hangzhou.oss-dls.aliyuncs.com/
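Because the path is always the last field of an -ls output line, the listing is easy to post-process with awk. A sketch using a hypothetical sample line (shown only to illustrate the field layout, not real output):

```shell
# Extract the path (the last field, $NF) from an -ls output line.
# SAMPLE is a hypothetical listing line used for illustration.
SAMPLE="-rw-rw-rw-   1 root root       1024 2023-01-05 12:00 oss://examplebucket.cn-hangzhou.oss-dls.aliyuncs.com/examplefile.txt"

# prints oss://examplebucket.cn-hangzhou.oss-dls.aliyuncs.com/examplefile.txt
echo "$SAMPLE" | awk '{print $NF}'
```

In practice you would pipe the real listing into awk, for example: hdfs dfs -ls oss://... | awk '{print $NF}'.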

Check sizes

Check the size of all objects and directories in examplebucket:

hdfs dfs -du oss://examplebucket.cn-hangzhou.oss-dls.aliyuncs.com/

Use -s to show the total size as a single summary line. You can also add -h to print sizes in human-readable units:

hdfs dfs -du -s oss://examplebucket.cn-hangzhou.oss-dls.aliyuncs.com/

View object content

Print the content of the object localfile.txt in examplebucket to the terminal:

hdfs dfs -cat oss://examplebucket.cn-hangzhou.oss-dls.aliyuncs.com/localfile.txt

Important

Content is displayed in plain text. If the object content is encoded, use the HDFS API for Java to read and decode it.

Copy an object or directory

Copy subdir1 to subdir2/subdir1 within the same bucket. The source directory, its objects, and subdirectory structure remain in place.

hdfs dfs -cp oss://examplebucket.cn-hangzhou.oss-dls.aliyuncs.com/subdir1 oss://examplebucket.cn-hangzhou.oss-dls.aliyuncs.com/subdir2/subdir1

Move an object or directory

Move srcdir and all its contents to destdir:

hdfs dfs -mv oss://examplebucket.cn-hangzhou.oss-dls.aliyuncs.com/srcdir oss://examplebucket.cn-hangzhou.oss-dls.aliyuncs.com/destdir

Download an object

Download exampleobject.txt from examplebucket to the local /tmp/ directory:

hdfs dfs -get oss://examplebucket.cn-hangzhou.oss-dls.aliyuncs.com/exampleobject.txt /tmp/

Delete objects or directories

Delete destfolder/ and all objects inside it:

hdfs dfs -rm -r oss://examplebucket.cn-hangzhou.oss-dls.aliyuncs.com/destfolder/

What's next