Common hdfs command operations:
hdfs help
-help [cmd] displays help information for the given command
[[email protected] ~]$ hdfs dfs -help ls
-ls lists files; add -R to recursively list everything under a directory:
[[email protected] ~]$ hdfs dfs -ls -h /
Found 1 items
drwxrwx---   - hadoop supergroup          0 2017-11-23 13:09 /tmp
[[email protected] ~]$ hdfs dfs -ls -h -R /
drwxrwx---   - hadoop supergroup          0 2017-11-23 13:09 /tmp
drwxrwx---   - hadoop supergroup          0 2017-11-23 13:09 /tmp/hadoop-yarn
drwxrwx---   - hadoop supergroup          0 2017-11-23 13:09 /tmp/hadoop-yarn/staging
drwxrwx---   - hadoop supergroup          0 2017-11-23 13:09 /tmp/hadoop-yarn/staging/history
drwxrwx---   - hadoop supergroup          0 2017-11-23 13:09 /tmp/hadoop-yarn/staging/history/done
drwxrwxrwt   - hadoop supergroup          0 2017-11-23 13:09 /tmp/hadoop-yarn/staging/history/done_intermediate
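The -d flag lists a directory entry itself rather than its contents (it appears again in the -moveFromLocal example below); a minimal sketch against the /tmp listing above:
[[email protected] ~]$ hdfs dfs -ls -d /tmp
drwxrwx---   - hadoop supergroup          0 2017-11-23 13:09 /tmp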
-du displays the sizes of the files in a directory:
[[email protected] ~]$ hdfs dfs -du -s -h /tmp/
0  /tmp
[[email protected] ~]$
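Without -s, -du reports each child entry on its own line; a hedged sketch (the output assumes the empty /tmp tree shown above):
[[email protected] ~]$ hdfs dfs -du -h /tmp/
0  /tmp/hadoop-yarn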
-count counts the directories, files, and bytes under a path
[[email protected] ~]$ hdfs dfs -count -q -h /tmp/
none inf none inf 6 0 0 /tmp
[[email protected] ~]$
With -q, the eight columns are QUOTA, REMAINING_QUOTA, SPACE_QUOTA, REMAINING_SPACE_QUOTA, DIR_COUNT, FILE_COUNT, CONTENT_SIZE, and PATHNAME, so the output above shows no quotas set, 6 directories, 0 files, and 0 bytes under /tmp.
-mkdir creates an HDFS directory at the given path. A relative path such as testdhadoop below is created under the user's HDFS home directory, /user/hadoop.
[[email protected] ~]$ hdfs dfs -mkdir testdhadoop
Recursively create nested directories with -p:
[[email protected] bin]$ ./hdfs dfs -mkdir -p /test1/test2/test3
# Recursively list everything under the new directory:
[[email protected] bin]$ ./hdfs dfs -ls -R /test1
drwxr-xr-x   - hadoop supergroup          0 2017-11-23 15:17 /test1/test2
drwxr-xr-x   - hadoop supergroup          0 2017-11-23 15:17 /test1/test2/test3
-mv moves one or more files or directories to a target directory (the sources must already exist in HDFS).
[[email protected] ~]$ hdfs dfs -mv /tmp/hadoop-yarn /user/hadoop/testdhadoop
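When the destination is a directory, -mv accepts several sources in one command; a minimal sketch with hypothetical paths:
[[email protected] ~]$ hdfs dfs -mv /logs/a.txt /logs/b.txt /user/hadoop/testdhadoop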
-cp copies one or more HDFS files to a target directory
[[email protected] ~]$ hdfs dfs -cp /user/hadoop/testdhadoop /tmp/hadoop-yarn
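In Hadoop 2.x, -cp also accepts -f to overwrite an existing destination; a minimal sketch with a hypothetical source path:
[[email protected] ~]$ hdfs dfs -cp -f /logs/test.txt /tmp/hadoop-yarn/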
-put copies a local file to HDFS
[[email protected] ~]$ hdfs dfs -put /etc/passwd /user/hadoop/testdhadoop
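-put can also read from standard input when the source is given as -; a minimal sketch (the destination file name here is an assumption):
[[email protected] ~]$ echo hello | hdfs dfs -put - /user/hadoop/stdin.txt   # writes stdin to a new HDFS file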
-copyFromLocal works the same as the -put command.
[[email protected] ~]$ hdfs dfs -copyFromLocal /etc/yum.conf /user/hadoop
[[email protected] ~]$ hdfs dfs -ls -R /user/hadoop
drwxr-xr-x   - hadoop supergroup          0 2017-11-23 14:37 /user/hadoop/testdhadoop
drwxrwx---   - hadoop supergroup          0 2017-11-23 13:09 /user/hadoop/testdhadoop/hadoop-yarn
drwxrwx---   - hadoop supergroup          0 2017-11-23 13:09 /user/hadoop/testdhadoop/passwd
-rw-r--r--   2 hadoop supergroup        969 2017-11-23 14:41 /user/hadoop/yum.conf
-moveFromLocal moves a local file into HDFS (removing the local copy).
[[email protected] ~]$ hdfs dfs -mkdir /logs
[[email protected] ~]$ hdfs dfs -ls -d /logs
drwxr-xr-x   - hadoop supergroup          0 2017-11-23 14:47 /logs
[[email protected] ~]$ hdfs dfs -moveFromLocal test.txt /logs
[[email protected] ~]$ hdfs dfs -ls -h /logs
Found 1 items
-rw-r--r--   2 hadoop supergroup         12 2017-11-23 14:49 /logs/test.txt
-get [-ignoreCrc] copies an HDFS file to the local filesystem; the -ignoreCrc flag skips CRC checksum verification.
[[email protected] ~]$ hdfs dfs -get /logs/test.txt /tmp/
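A hedged sketch of the checksum-related flags from the FsShell usage (-ignoreCrc copies even files that fail the CRC check; -crc also downloads the checksum file):
[[email protected] ~]$ hdfs dfs -get -ignoreCrc /logs/test.txt /tmp/
[[email protected] ~]$ hdfs dfs -get -crc /logs/test.txt /tmp/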
-copyToLocal works the same as the -get command; it copies an HDFS file to the local filesystem.
[[email protected] bin]$ ./hdfs dfs -copyToLocal /logs/test.txt /home/hadoop/
[[email protected] ~]$ ls -lh /home/hadoop/
total 16K
drwxrwxr-x.  4 hadoop hadoop 4.0K Nov 23 12:12 dfs
drwxr-xr-x. 11 hadoop hadoop 4.0K Nov 23 12:47 hadoop
-rw-r--r--.  1 hadoop hadoop   12 Nov 23 15:05 test.txt
drwxrwxr-x.  3 hadoop hadoop 4.0K Nov 23 12:48 tmp
-cat prints a file's contents to the terminal
[[email protected] /]$ hdfs dfs -cat /logs/test.txt
hello world
[[email protected] /]$
-text prints a file to the terminal, decoding the source file into text format. Allowed formats are zip and TextRecordInputStream.
[[email protected] bin]$ ./hdfs dfs -text /logs/test.txt
hello world
# -tail displays the last kilobyte of a file:
[[email protected] /]$ hdfs dfs -tail /logs/part-00000
# To see the beginning of a file instead, pipe -cat through head:
[[email protected] /]$ hdfs dfs -cat /logs/part-00000 | head
-touchz creates an empty (zero-length) file in HDFS.
[[email protected] bin]$ ./hdfs dfs -touchz /test1/1.txt
[[email protected] bin]$ ./hdfs dfs -ls -R /test1
-rw-r--r--   2 hadoop supergroup          0 2017-11-23 15:20 /test1/1.txt
drwxr-xr-x   - hadoop supergroup          0 2017-11-23 15:17 /test1/test2
drwxr-xr-x   - hadoop supergroup          0 2017-11-23 15:17 /test1/test2/test3
-getmerge [addnl] sorts and merges all files in an HDFS source directory into a single local file. It takes a source directory and a destination file as input and concatenates every file in the source directory into the local destination file. The optional addnl argument appends a newline after each file. (In Hadoop 2.x this option is spelled -nl.)
# Merge all files under /logs/* on HDFS and download them into the local file /tmp/hello:
[[email protected] bin]$ ./hdfs dfs -getmerge /logs/* /tmp/hello
[[email protected] bin]$ cat /tmp/hello
111111111111111111111111
hello world
[[email protected] bin]$
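A minimal sketch of the newline option using the Hadoop 2.x spelling (the destination file name is an assumption):
[[email protected] bin]$ ./hdfs dfs -getmerge -nl /logs /tmp/hello2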
grep filters lines containing a given string from files on HDFS. hdfs dfs has no built-in grep, so pipe -cat output through the local grep:
[[email protected] bin]$ ./hdfs dfs -cat /logs/* | grep <pattern>
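The same pipe works with any local grep option, for example counting matching lines (pattern placeholder as above):
[[email protected] bin]$ ./hdfs dfs -cat /logs/* | grep -c <pattern>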