HDFS Shell Operations Explained

List the files under a given path in the HDFS file system:

[root@master ~]# hdfs dfs -ls /

Found 4 items

drwxr-xr-x   - root supergroup          0 2015-11-05 03:31 /data

drwxr-xr-x   - root supergroup          0 2015-11-05 03:32 /output

drwxrwx---   - root supergroup          0 2015-11-05 07:51 /tmp

drwxr-xr-x   - root supergroup          0 2015-11-06 23:59 /usr
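
Tip (not part of the original session): -ls also accepts -R to list everything under a path recursively:

hdfs dfs -ls -R /data   // recursively list all files and directories under /data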

 

Create a directory in the HDFS file system:

[root@master ~]# hdfs dfs -mkdir /data/mydata
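
If intermediate directories do not exist yet, -mkdir also takes a -p flag (as in Linux) to create the missing parents, for example (illustrative path):

hdfs dfs -mkdir -p /data/mydata/2015/11   // create the whole path, including missing parent directories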

[root@master ~]# gedit test.txt

[root@master ~]# hdfs dfs -put test.txt /data/mydata   // upload a local file to HDFS
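
A small extra worth knowing: in the Hadoop releases I am aware of, -put can also read from standard input when the source is given as "-", for example (illustrative target file):

echo "streamed line" | hdfs dfs -put - /data/mydata/from_stdin.txt   // write stdin straight into an HDFS file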

 

[root@master ~]# hdfs dfs -dus /data/mydata/test.txt   // check the size of a file

dus: DEPRECATED: Please use 'du -s' instead.

73 /data/mydata/test.txt
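
As the deprecation warning suggests, the modern equivalent is the summarized form of -du:

hdfs dfs -du -s /data/mydata/test.txt   // same summarized size, without the deprecation warning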

 

[root@master ~]# hdfs dfs -du /data/mydata   // show the size of each file under a directory

73 /data/mydata/test.txt

 

[root@master ~]# hdfs dfs -cp /data/mydata/test.txt /usr/    // copy a file from the source path to the destination path
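
Like the Linux cp, -cp accepts several source paths as long as the last argument is a directory, for example (illustrative paths):

hdfs dfs -cp /data/mydata/test.txt /tmp/other.txt /usr/   // copy both files into /usr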

 

[root@master ~]# hdfs dfs -copyFromLocal test.txt /tmp   // upload a local file to HDFS; equivalent to put
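
Both -copyFromLocal and -put refuse to overwrite an existing target by default; -copyFromLocal takes -f to force the overwrite (newer releases expose the same flag on -put):

hdfs dfs -copyFromLocal -f test.txt /tmp   // overwrite /tmp/test.txt if it already exists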

 

[root@master ~]# mkdir data

[root@master ~]# cd data

[root@master data]# hdfs dfs -copyToLocal /tmp/test.txt   // copy a file from HDFS to the local file system; equivalent to get
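
With no local destination given, the file lands in the current directory; an explicit local target can also be supplied, for example (illustrative name):

hdfs dfs -copyToLocal /tmp/test.txt ./test_copy.txt   // choose the local file name yourself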

 

[root@master data]# ls

test.txt

[root@master data]# rm test.txt

rm: remove regular file 'test.txt'? y

[root@master data]# ls

[root@master data]# hdfs dfs -get /tmp/test.txt   // copy a file from HDFS to the local file system

[root@master data]# ls

test.txt

[root@master data]# cat test.txt

hello world

hello hadoop

spark scala

centos linux

hello linux

bye hadoop

[root@master data]# ls

test.txt

[root@master data]# hdfs dfs -put test.txt /output

[root@master data]# hdfs dfs -ls /output

Found 1 items

-rw-r--r--   2 root supergroup         73 2015-11-09 04:37 /output/test.txt

[root@master data]# hdfs dfs -touchz /tmp/mydata.txt   // create an empty file

[root@master data]# hdfs dfs -ls /tmp

Found 4 items

drwxrwx---   - root supergroup          0 2015-11-05 03:26 /tmp/hadoop-yarn

drwx-wx-wx   - root supergroup          0 2015-11-05 07:51 /tmp/hive

-rw-r--r--   2 root supergroup          0 2015-11-09 04:38 /tmp/mydata.txt

-rw-r--r--   2 root supergroup         73 2015-11-09 04:34 /tmp/test.txt
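
-touchz only ever creates zero-length files (note the size 0 above) and reports an error if the target already exists with a non-zero length; existence can be checked beforehand with -test:

hdfs dfs -test -e /tmp/mydata.txt && echo exists   // -test -e returns 0 when the path exists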

 

[root@master data]# hdfs dfs -tail /tmp/test.txt   // display the last 1 KB of the specified file

hello world

hello hadoop

spark scala

centos linux

hello linux

bye hadoop
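
Like the Linux tail, -tail also supports -f to keep streaming data as it is appended to the file:

hdfs dfs -tail -f /tmp/test.txt   // follow the file; useful for logs being written into HDFS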

 

[root@master data]# hdfs dfs -ls /output

Found 1 items

-rw-r--r--   2 root supergroup         73 2015-11-09 04:37 /output/test.txt

[root@master data]# hdfs dfs -rm /output/test.txt   // delete a file

15/11/09 04:48:06 INFO fs.TrashPolicyDefault: Namenode trash configuration: Deletion interval = 0 minutes, Emptier interval = 0 minutes.

Deleted /output/test.txt
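
The INFO line above shows "Deletion interval = 0 minutes", meaning the HDFS trash is disabled on this cluster and the file is removed immediately. Where trash is enabled, -skipTrash bypasses it, and -r is needed to delete directories, for example (illustrative path):

hdfs dfs -rm -r -skipTrash /output/olddir   // remove a directory tree immediately, without moving it to the trash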

[root@master data]# hdfs dfs -mv /tmp/test.txt /output/   // move a file within HDFS

[root@master data]# hdfs dfs -ls /tmp

Found 3 items

drwxrwx---   - root supergroup          0 2015-11-05 03:26 /tmp/hadoop-yarn

drwx-wx-wx   - root supergroup          0 2015-11-05 07:51 /tmp/hive

-rw-r--r--   2 root supergroup          0 2015-11-09 04:38 /tmp/mydata.txt

[root@master data]# hdfs dfs -ls /output

Found 1 items

-rw-r--r--   2 root supergroup         73 2015-11-09 04:34 /output/test.txt

[root@master data]# hdfs dfs -cat /output/test.txt   // view the contents of a text file

hello world

hello hadoop

spark scala

centos linux

hello linux

bye hadoop

[root@master data]# ls

test.txt

[root@master data]# rm test.txt

rm: remove regular file 'test.txt'? y

[root@master data]# ls

[root@master data]# vim data.txt

[root@master data]# hdfs dfs -text /output/data.txt   // view the contents of a text file

hadoop

linux

spark

scala

hive

hbase

zookeeper

sqoop
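
-text prints the same thing as -cat here because data.txt is plain text; the difference is that -text can also decode compressed files and SequenceFiles before printing, for example (illustrative path):

hdfs dfs -text /output/logs.gz   // print the decompressed contents of a gzipped file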

[root@master data]# hdfs dfs -moveFromLocal data.txt /output   // move (cut) a local file into HDFS

[root@master data]# hdfs dfs -ls /output

Found 2 items

-rw-r--r--   2 root supergroup         52 2015-11-09 04:53 /output/data.txt

-rw-r--r--   2 root supergroup         73 2015-11-09 04:34 /output/test.txt

[root@master data]#
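
Each of the subcommands used above documents its full set of options through the built-in help, for example:

hdfs dfs -help put   // print usage and options for a single subcommand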
