Hadoop Command Summary
List files,
$ ./hadoop fs -ls
Found 17 items
-rwxr-xr-x 1 yj70978 retailfi 1259 2013-01-30 21:05 /home/yj70978/hadoop/hadoop-1.1.2/bin/start-mapred.sh
-rwxr-xr-x 1 yj70978 retailfi 2642 2013-01-30 21:05 /home/yj70978/hadoop/hadoop-1.1.2/bin/hadoop-config.sh
-rwxr-xr-x 1 yj70978 retailfi 2810 2013-01-30 21:05 /home/yj70978/hadoop/hadoop-1.1.2/bin/rcc
-rwxr-xr-x 1 yj70978 retailfi 14189 2013-07-22 07:58 /home/yj70978/hadoop/hadoop-1.1.2/bin/hadoop
-rwxr-xr-x 1 yj70978 retailfi 1329 2013-01-30 21:05 /home/yj70978/hadoop/hadoop-1.1.2/bin/hadoop-daemons.sh
-rwxr-xr-x 1 yj70978 retailfi 1145 2013-01-30 21:05 /home/yj70978/hadoop/hadoop-1.1.2/bin/start-jobhistoryserver.sh
-rwxr-xr-x 1 yj70978 retailfi 2143 2013-01-30 21:05 /home/yj70978/hadoop/hadoop-1.1.2/bin/slaves.sh
-rwxr-xr-x 1 yj70978 retailfi 1116 2013-01-30 21:05 /home/yj70978/hadoop/hadoop-1.1.2/bin/stop-balancer.sh
-rwxr-xr-x 1 yj70978 retailfi 1745 2013-01-30 21:05 /home/yj70978/hadoop/hadoop-1.1.2/bin/start-dfs.sh
-rwxr-xr-x 1 yj70978 retailfi 1168 2013-01-30 21:05 /home/yj70978/hadoop/hadoop-1.1.2/bin/stop-mapred.sh
-rwxr-xr-x 1 yj70978 retailfi 1246 2013-01-30 21:05 /home/yj70978/hadoop/hadoop-1.1.2/bin/stop-dfs.sh
-rwxr-xr-x 1 yj70978 retailfi 1166 2013-01-30 21:05 /home/yj70978/hadoop/hadoop-1.1.2/bin/start-all.sh
-rwxr-xr-x 1 yj70978 retailfi 1119 2013-01-30 21:05 /home/yj70978/hadoop/hadoop-1.1.2/bin/stop-all.sh
-rwxr-xr-x 1 yj70978 retailfi 63970 2013-01-30 21:06 /home/yj70978/hadoop/hadoop-1.1.2/bin/task-controller
-rwxr-xr-x 1 yj70978 retailfi 1065 2013-01-30 21:05 /home/yj70978/hadoop/hadoop-1.1.2/bin/start-balancer.sh
-rwxr-xr-x 1 yj70978 retailfi 1131 2013-01-30 21:05 /home/yj70978/hadoop/hadoop-1.1.2/bin/stop-jobhistoryserver.sh
-rwxr-xr-x 1 yj70978 retailfi 4649 2013-01-30 21:05 /home/yj70978/hadoop/hadoop-1.1.2/bin/hadoop-daemon.sh
HDFS has a default working directory of /user/$USER, where $USER is your login user name. This directory isn't automatically created for you, though, so let's create it with the mkdir command.
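For example, a minimal sketch that relies on the shell expanding $USER to your login name:
$ hadoop fs -mkdir /user/$USER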
List all subdirectories recursively,
hadoop fs -lsr /user
Copy a file from the local filesystem to HDFS,
$ hadoop fs -put 1.txt /user/mz50947
No encryption was performed by peer.
$ hadoop fs -ls /user/mz50947
No encryption was performed by peer.
Found 2 items
-rw-r--r-- 3 mz50947 enterpriserisk 0 2013-07-23 01:39 /user/mz50947/1
-rw-r--r-- 3 mz50947 enterpriserisk 0 2013-07-23 01:40 /user/mz50947/1.txt
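put also accepts multiple local sources in one call; a sketch, where 2.txt is a hypothetical second local file:
$ hadoop fs -put 1.txt 2.txt /user/mz50947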
Delete a file in HDFS,
$ hadoop fs -rm /user/mz50947/1.txt
No encryption was performed by peer.
Moved: 'hdfs://bdwar001m01l.nam.nsroot.net:8020/user/mz50947/1.txt' to trash at: hdfs://bdwar001m01l.nam.nsroot.net:8020/user/mz50947/.Trash/Current
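As the output above shows, rm moves the file to .Trash by default. If your release supports the -skipTrash option, the file can be deleted outright; a hedged sketch using the same example path:
$ hadoop fs -rm -skipTrash /user/mz50947/1.txt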
See the content of a file,
hadoop fs -cat /user/mz50947/1.txt
No encryption was performed by peer.
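Because cat streams the file to stdout, it can be piped through ordinary shell tools to preview a large file, for example:
$ hadoop fs -cat /user/mz50947/1.txt | head -n 20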
Create an empty file in HDFS (touch),
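Presumably this refers to the touchz command, which creates a zero-length file; a sketch with a hypothetical path:
$ hadoop fs -touchz /user/mz50947/empty.txt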
Get a file from HDFS to the local filesystem,
hadoop fs -get /user/mz50947/1.txt .
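A related command, getmerge, concatenates the files under an HDFS directory into one local file; a hedged sketch, where merged.txt is a hypothetical local destination:
$ hadoop fs -getmerge /user/mz50947 merged.txt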
Look up help,
hadoop fs -help
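Help for a single command can also be requested by name, for example:
$ hadoop fs -help rm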