Basic HDFS Commands | Hadoop | HDFS
0:00
In this video we are discussing basic HDFS commands
0:05
There are many different HDFS commands, so we will be discussing some of them
0:10
So some basic HDFS commands are there. HDFS commands are very similar to the Unix commands, though the syntax and output
0:18
format may differ for some of them. So here we have a small command list. ls:
0:25
the description is that it lists all files with permissions and other details
0:29
mkdir, for the creation of a directory. rm, to remove a file or a directory
0:38
put, to store a file or folder from the local hard disk into HDFS
0:43
So put copies from the local hard disk to HDFS
0:47
cat, to concatenate text or display file contents. The next one is get, which is the opposite of put:
0:55
it stores a file or folder from HDFS onto the local disk. Then we are having count, which counts the number of directories, the number of files
1:05
and the file size. So these are the set of commands, and the respective descriptions were provided,
1:11
and we will now go into more detail on these commands
1:18
At first we are going into more detail about the ls command. The syntax is hdfs
1:23
dfs -ls, and then you can mention the path. This
1:28
ls command will show all the files and folders which are under the respective path
1:34
It also displays the file permissions, the owner and group, the modification time,
1:40
the file size; everything will be displayed. So this is the ls command,
1:44
and this is how to issue it: hdfs dfs -ls <path>
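As an illustration, a typical ls invocation might look like this (the /user/data path is only an assumed example, not one from the video):
  # list the contents of an HDFS directory with permissions, owner, size and timestamp
  hdfs dfs -ls /user/data
  # add -R to list subdirectories recursively
  hdfs dfs -ls -R /user/data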
1:53
Next we are going for the mkdir command. The syntax will be hdfs dfs -mkdir <path or directory name>. The mkdir command is used to create a new directory inside
2:06
the given location. So the new directory name will be given here along with the path, and
2:12
under that particular path the new directory will be created. Obviously, to run these commands
2:18
you have to start Hadoop first
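For illustration, a mkdir call might look like the following (the directory names are only assumed examples):
  # create a new directory under the HDFS root
  hdfs dfs -mkdir /mydir
  # -p also creates any missing parent directories
  hdfs dfs -mkdir -p /mydir/sub1/sub2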
2:26
Next we are having the rm command. rm stands for remove. The syntax is hdfs dfs -rm and then the file path. Another option may be used,
2:38
that is -r, followed by a directory name. The rm command is used to remove a file
2:44
or a directory. If the directory has some elements inside, we need to use rm -r; in that case
2:51
it will go for recursive deletion, that is, the option recursively deletes all the internal elements of that
2:58
folder or directory
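A minimal sketch of both forms, assuming the file and directory names below are just placeholders:
  # remove a single file
  hdfs dfs -rm /mydir/old_file.txt
  # remove a directory and everything inside it, recursively
  hdfs dfs -rm -r /mydir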
3:04
The next one is the put command. The syntax will be hdfs dfs -put <path1> <path2> ... <destination>. The put command is used to store data
3:11
into HDFS from the local disk. It can take multiple arguments, and all of them are sources except
3:18
the last one. You can have multiple source arguments, but the last one denotes the
3:23
destination, that is, the destination path in HDFS. So put simply transfers files from the local file system to HDFS
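As a quick sketch (file and directory names assumed), put with two sources and one HDFS destination might look like:
  # copy two local files into an HDFS directory; the last argument is the destination
  hdfs dfs -put ~/data/a.txt ~/data/b.txt /mydir
  # copyFromLocal works the same way for local sources
  hdfs dfs -copyFromLocal ~/data/a.txt /mydir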
3:36
We are going for the next one, that is get. The syntax is hdfs dfs -get <path1> <path2> ... <destination>
3:46
The get command is used to store data from HDFS onto the local disk
3:52
It can take multiple arguments, all of them source locations in HDFS, except the last one, and
3:58
the last one will be the destination path on the local hard disk. So using this get command, you can transfer files from HDFS to the
4:08
local file system
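A quick sketch of get, again with assumed names:
  # copy a file from HDFS into a local directory
  hdfs dfs -get /mydir/a.txt ~/downloads/
  # copyToLocal is the equivalent command for HDFS sources
  hdfs dfs -copyToLocal /mydir/a.txt ~/downloads/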
4:12
The next one is count; count is another command. The syntax will be hdfs dfs -count <path>. The count command is used to count the number of directories and files inside the given directory,
4:22
whatever path you have given. So the count of the files and directories will be provided,
4:28
and it also shows the content size of the directory
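For illustration (path assumed), count prints the directory count, file count and content size in bytes, followed by the path:
  # show directory count, file count and total size for a path
  hdfs dfs -count /mydir
  # output columns: DIR_COUNT  FILE_COUNT  CONTENT_SIZE  PATHNAME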
4:35
So in this way we have discussed several HDFS commands. Now let us go for one practical demonstration for your easier and better understanding
4:41
So here is the demonstration for you. In this demonstration, we shall try out multiple different Hadoop HDFS commands
4:48
At first we are opening one terminal with Ctrl+Alt+T, and we shall also open one gedit window
4:56
We are going for the commands. Okay, I am just opening the editor, and here we will be typing the list of
5:03
commands we are going to initiate. So we shall show you commands like
5:08
ls, then mkdir, then we shall go for put and copyFromLocal,
5:19
then we shall go for get and copyToLocal; these are the commands we will be showing you
5:28
Then we shall go for, say, count, then rm, then cat, and also move, that is mv. So
5:37
these are very common commands, and we are going to show them to you one by one. So we will be
5:43
showing you ls, mkdir, put, copyFromLocal, then get, then copyToLocal
5:51
Remember that in copyToLocal the T and the L are capital, while copy is all in lowercase letters,
5:58
and in copyFromLocal the F and the L are capital. Then we shall go for rm, then cat, and then mv. Okay, so
6:04
this is the list of commands; we are finding ten commands in total, and we will be issuing them one by one
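For reference, the ten commands could be jotted down in the editor in their generic forms, roughly like this (the placeholder arguments are assumptions, not taken from the video):
  hdfs dfs -ls <path>
  hdfs dfs -mkdir <path>
  hdfs dfs -put <local-src> <hdfs-dest>
  hdfs dfs -copyFromLocal <local-src> <hdfs-dest>
  hdfs dfs -get <hdfs-src> <local-dest>
  hdfs dfs -copyToLocal <hdfs-src> <local-dest>
  hdfs dfs -count <path>
  hdfs dfs -rm [-r] <path>
  hdfs dfs -cat <file>
  hdfs dfs -mv <src> <dest>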
6:10
Now I am just opening another terminal here; I want to put them side by side, so these two
6:22
terminals are there. Now let me introduce one folder. We are going
6:28
to the home directory, and under home we are having one folder, that is HadoopMyFiles. You can see it
6:34
is in the Ubuntu Linux file system; it is not in HDFS. Here we are having this HadoopMyFiles folder,
6:39
which contains two files: one is sample_file.txt and the other one is
6:45
student_info.csv. We will be working on them; we will be putting these files onto the HDFS
6:51
folders, and then from there we will be getting them back and working on them
6:56
So what is the current content of this sample_file.txt and student_info.csv?
7:01
Let me show you that one also. So we are under this folder,
7:08
and we are going for cat on home/Hadoop
7:14
MyFiles. But let us go for ls at first, so that I can see what files are there
7:22
So here we are having this; these are the list of files. With ls -l we can see that we are having sample_file.txt
7:34
and student_info.csv. So let me check their contents. I shall go for cat,
7:41
then home/, we shall go for HadoopMyFiles/sample_file.txt
7:55
So that is cat on HadoopMyFiles/sample_file.txt, and this is its content. Going for the other
8:06
one, that is student_info.csv. So these two file contents are in front of us: one is sample_file.txt and the other one is student_info.csv
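The local checks could look something like this (folder and file names are assumed spellings of what is shown on screen):
  # list the local folder with details
  ls -l ~/HadoopMyFiles
  # display the contents of the two local files
  cat ~/HadoopMyFiles/sample_file.txt
  cat ~/HadoopMyFiles/student_info.csv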
8:27
Now I am going for the browser here. In this browser I am going to the respective folder
8:34
You can find that under the HDFS root we are having no such folder called HadoopMyFiles
8:41
So I shall create one folder in the HDFS root. This is the root, you can find that,
8:46
and we will be creating one folder under it. How to create it? You know that to open
8:52
this page we went to localhost:50070 and then slash, we got it, and
9:04
from Utilities, Browse the file system, from there we came here. So I want to create one folder here;
9:09
the name of the folder will be HadoopMyFiles, so I am just going to create that folder. For that
9:16
I shall be using the mkdir command. How to use it? See, we can issue hdfs dfs -mkdir, and then the folder will get created under the root of HDFS, that is
9:39
/HadoopMyFiles. I think the folder has got created; let me go for
9:46
a check. It is not being shown yet, but if I go for refresh you can find that
9:51
this HadoopMyFiles folder has got created, and you can see the folder is empty
9:56
So the folder has got created. I can also create such
10:04
folders using another way, that is with hadoop fs, and the rest of the command
10:13
will remain the same. As we have already created HadoopMyFiles, I am going for another one;
10:23
I am going for, say, HadoopMyFiles1. So this is the folder name
10:28
that I am giving here: hadoop fs -mkdir /HadoopMyFiles1
10:36
So if I press enter, and then come back to this page and go for refresh,
10:45
you can find that another folder has got created. So that is very simple: either you can use hdfs dfs or
10:53
you can use hadoop fs
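As a quick sketch of the two equivalent forms (folder name assumed from the demo):
  # these two commands do the same thing
  hdfs dfs -mkdir /HadoopMyFiles1
  hadoop fs -mkdir /HadoopMyFiles1
  # "hadoop fs" is the generic file-system shell; "hdfs dfs" targets HDFS specifically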
10:58
So how do we remove this folder, this HadoopMyFiles1? Right now I don't require it, so hdfs dfs:
11:05
we can also go for rm, and rm -r means it will go recursively, so all
11:12
the folders, subfolders and files, whatever is residing in this HadoopMyFiles1, will get deleted. So there is the recursive option,
11:21
-r, and we shall apply it to this HadoopMyFiles1. So the command is hdfs dfs -rm -r, where -r
11:32
is recursive, then HadoopMyFiles1. If I press enter, the folder has got deleted
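The removal as typed in the demo would look roughly like this (name assumed):
  # recursively delete the folder and anything inside it
  hdfs dfs -rm -r /HadoopMyFiles1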
11:40
You see, if I go for the refresh, you can see the HadoopMyFiles1
11:46
has got deleted. So how do we see this listing from the
11:51
terminal as well? We shall go for hdfs dfs -ls /, where the
12:00
slash means the HDFS root. We can find that these are the folders that are there:
12:06
HadoopMyFiles is there, these are the Hadoop folders we are having, and this one is
12:11
mine. So if I go for this one, you are getting this HadoopMyFiles, but
12:16
the HadoopMyFiles1 has already got deleted, removed. Okay, so we
12:23
have done up to this: we have gone through mkdir, we have shown you
12:29
how to do mkdir, and also we have shown you how to use rm,
12:34
and also how to use ls. This ls can also be done in another way,
12:41
that is hadoop fs -ls /, so we can use that as well. Okay, so this much we have done; going for clear
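The two listing forms used here might be typed as:
  # list the HDFS root directory
  hdfs dfs -ls /
  # the same listing through the generic file-system shell
  hadoop fs -ls /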
12:57
Now I want to copy those two files. Which are the files? So ls -l on that folder: it is our home
13:04
folder, and under it the HadoopMyFiles folder; I have to see all the files there. We are
13:14
having sample_file.txt and student_info.csv. I have already
13:20
created this folder here in HDFS, so I want to copy them to that particular folder; that means I want to copy from the Ubuntu Linux file system to HDFS
13:29
In that case, what we can do is issue the command hdfs dfs
13:40
-put; we can go for -put. Okay, so from where? From the home directory:
13:49
we are going for home, and then HadoopMyFiles; from the home we shall go to
13:57
HadoopMyFiles/, and what is the file name? That is the sample file
14:04
And into which folder? Into /HadoopMyFiles under the root; under this folder I want to copy the respective file. So that is hdfs dfs -put
14:20
We are going from the Ubuntu Linux file system, that is the local file system, where the home directory
14:26
is denoted by the tilde sign; HadoopMyFiles is the respective folder name, and then sample
14:31
_file.csv, or rather sample_file.txt, I think it is the .txt one, so I shall go
14:39
for this one as .txt. Okay, so there is the .txt, and then it will be copied to this folder,
14:45
that is, the root of HDFS and then HadoopMyFiles. Okay, so let me check
14:53
whether it has been copied there or not. I shall go for hadoop fs -
15:00
ls, and then I shall go for /HadoopMyFiles
15:09
You can see that the file, that is sample_file.txt, has been copied there
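Put as typed here might look like the following (the tilde expands to the local home directory; names are assumed spellings):
  # copy the local file into the HDFS folder
  hdfs dfs -put ~/HadoopMyFiles/sample_file.txt /HadoopMyFiles
  # verify the copy
  hadoop fs -ls /HadoopMyFiles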
15:16
Okay, so the command was put. Instead of put I can also use copyFromLocal, so I am just
15:24
erasing this put here and going for copyFromLocal. I shall go for this copyFromLocal, sticking
15:32
with the same command; only the file name I am going to change. Here I shall go for
15:37
student_info.csv, student_info.csv. You can easily find that one file we are
15:46
having there is student_info.csv, and this file I am going to copy here. So
15:51
hdfs dfs -copyFromLocal, then the source: it is the home
15:57
folder, rather, then HadoopMyFiles, then student, so there is student
16:04
_info.csv, to the respective HDFS folder, that is /HadoopMyFiles under the root. So if
16:11
I go for this, copyFromLocal is working; and if I go for the ls, I can find two files
16:20
have got copied there. One I copied using the command put, and the other one I copied using the command
16:26
copyFromLocal. So I have shown you put, and I have also shown you this copyFromLocal here
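In command form, the copyFromLocal step could look like this (assumed names again):
  # copy the local CSV into the same HDFS folder
  hdfs dfs -copyFromLocal ~/HadoopMyFiles/student_info.csv /HadoopMyFiles
  # both files should now be listed
  hdfs dfs -ls /HadoopMyFiles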
16:34
Now let me go here in the browser; you can see that if I click, I am finding both the files here. Okay, now I want to make a copy of this sample_file.txt as, say, sample_file1. So how do we do that?
16:51
Let me come down to the terminal here. I shall go for this copy. Okay, see, I shall go for
16:57
hdfs dfs -cp, hdfs dfs -cp. So the HDFS root, then Hadoop
17:10
MyFiles, then I am having, say, sample_file.csv; into the same
17:25
folder I am going to copy it, though I could select another path as well, as we usually do:
17:29
/HadoopMyFiles/sample_file1.csv
17:40
Or rather it is sample_file.txt, so I shall make this one .txt. So see the
17:49
command, that is hdfs dfs -cp /HadoopMyFiles/sample_file.txt
17:58
to /HadoopMyFiles/sample_file1.txt. So here I am going to show you the
18:03
command that is cp; here we are showing the cp command. So let me
18:08
press enter. Okay, and let me go here in the browser; if you go for this one, you
18:16
can find that another file has got created, that is sample_file1.txt
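The copy step in command form, with the same assumed names:
  # make a copy of the file within HDFS
  hdfs dfs -cp /HadoopMyFiles/sample_file.txt /HadoopMyFiles/sample_file1.txt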
18:22
Now I want to rename this one. Student info... rather, I want to rename this
18:27
sample_file1.txt to sample_file2.txt. How do we do that one? Okay, so let me
18:33
come back. So another command we have discussed here, that is cp;
18:38
we have covered that one. So let me go for the renaming. Okay, let me go for
18:43
the renaming. So what is happening here? Let me go for the
18:47
listing; but first let me go for clear, that will make the screen clear
18:54
Yes, now the listing: -ls /HadoopMyFiles, /HadoopMyFiles
19:03
So these are the files I will be working with: sample_file,
19:08
that is sample_file.txt, which has got copied as sample_file1.txt, and I
19:16
want to rename that one. Okay, see, so I shall go for hdfs dfs -mv
19:26
So what is the path? The root, then HadoopMyFiles, and I shall go for sample
19:38
_file1.txt, to /HadoopMyFiles, /HadoopMyFiles/sample_files... sample
19:54
_files... let me check once what the source is, file1.txt I think, let me check once please. So
20:00
yes, there is file1.txt. So now I shall come back, and the destination is sample_files
20:08
2.txt. So that is the move we are going to do. Coming down here, now it has got
20:19
renamed to sample_files2.txt
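The rename as issued here, with the assumed names (note the destination actually typed was sample_files2.txt, with an extra "s"):
  # rename a file within the same HDFS directory
  hdfs dfs -mv /HadoopMyFiles/sample_file1.txt /HadoopMyFiles/sample_files2.txt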
20:26
While doing this move we can also move the file to a different path. So let us go for, say, hadoop fs -mkdir: under this
20:37
HadoopMyFiles I shall go for the creation of one directory, that is,
20:45
say, dir1; I am making a directory, that is dir1, here. Okay, I
20:55
think I have done it; let me check once. I shall come back here, and yes, the
20:59
dir1 is there, which is empty right now. So let me move that file:
21:04
I shall go for this mv; there is the file, and let me make the target name file2.txt
21:14
So this particular file I want to move to this dir1; I want to move it into
21:22
dir1. Okay, let me check once. Now if I go for this refresh, if I go
21:32
for this refresh, the file has not been copied. So let me check once again: I typed the source file name as
21:44
sample_file2.txt, sample_file2.txt. Let me check
21:53
whether that file is existing there or not. So the file there is sample
21:57
_files2.txt; yes, that is what exists, so here the file name should be
22:04
files2; that was my mistake. So I am making this one files2
22:10
So that is hdfs dfs -mv /HadoopMyFiles/sample_files2.txt
22:18
/HadoopMyFiles/dir1/, and I am making the target name, say, file2
22:23
.txt. You can find that I am also moving it to a different path
22:27
and also renaming it at the same time. So I am going for this
22:33
Let me check once; I am going into this dir1. Yes, there is file2
22:36
.txt. So in this way you can find how this mv command can be issued here
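Put together, that step might read like this (directory and file names are assumed renderings of what is typed in the demo):
  # create the target directory, then move and rename the file in one step
  hadoop fs -mkdir /HadoopMyFiles/dir1
  hdfs dfs -mv /HadoopMyFiles/sample_files2.txt /HadoopMyFiles/dir1/file2.txt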
22:41
So mv I have shown you. Next I shall go for the content;
22:45
I want to show you the content of a file here. So how do we show the content?
22:49
I shall go for, say, hadoop fs -cat, then /HadoopMyFiles
23:05
Here we are having student_info; you can find that we are having this student_info.csv
23:14
You see, the current content is being shown. Going for clear. So let me show you again
23:22
using the same command, but here I shall go for hdfs dfs
23:32
You can find that I am getting the same content. That is our cat, so I have shown you how to use this cat. So this much we have done
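The two equivalent cat forms used here (names assumed):
  # print an HDFS file's contents to the terminal
  hadoop fs -cat /HadoopMyFiles/student_info.csv
  hdfs dfs -cat /HadoopMyFiles/student_info.csv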
23:41
Now what I shall do: for the removal of the directory we will also be
23:47
using this rm, and I shall show that one later. Next I shall go for this get. Now, from the
23:51
HDFS file system I will be copying a file to our local file system, and for that
23:59
we will be using the command get. Okay, so let me check what files we are
24:04
having here. Okay, now see, I shall go for hdfs dfs -ls, and we are
24:22
having this /HadoopMyFiles. Okay, so these are the files we are
24:33
having: there is sample_file.txt and student_info.csv. Both the files are existing, and I want to copy them to my local
24:43
folder. So I am just doing it in this way; before going there I am just
24:48
showing you what files I am having here locally. I shall go for
24:52
this home, then HadoopMyFiles, and you can find that we are having these. Okay, so this is the file which is in the HDFS file system, and I want to copy it to this Ubuntu Linux local file system, with renaming
25:13
So how do we do it? We shall go for hdfs dfs -get, then slash:
25:21
there is the HDFS root, then HadoopMyFiles. So what is the file name?
25:31
There is sample_file.txt. To where? I shall go for
25:42
this one: home, then HadoopMyFiles, and I am naming it sample1.txt
25:56
Okay, so I am just getting one file from the
26:01
HDFS file system to the local file system. So see the command; that is
26:05
hdfs dfs -get /HadoopMyFiles/sample_file.txt to
26:14
the home directory, HadoopMyFiles, sample1.txt. So if I do the copy here and
26:23
see the current content, you can find that sample1.txt has been copied here
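In command form, that get step could be written as (names assumed):
  # copy a file out of HDFS into the local folder, renaming it on the way
  hdfs dfs -get /HadoopMyFiles/sample_file.txt ~/HadoopMyFiles/sample1.txt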
26:27
Okay, now I shall issue the same kind of command another time, but here I will be doing some changes. So
26:35
here I will be copying the file that is student_info; I am going for this student_info,
26:40
and student1.csv will be the target file name. Okay, so here I shall go for
26:52
hadoop fs, because I want to show that it will also work, and instead of get I shall
27:00
use copyToLocal. So I am showing you all possible combinations:
27:08
hadoop fs -copyToLocal, then /HadoopMyFiles, and what is the name
27:14
here? That is student_info.csv, so I am just typing this one. Okay, so
27:24
here I shall go for this student_info.csv. So that is hadoop
27:44
fs -copyToLocal, then the HDFS root, that is
27:52
/HadoopMyFiles/student_info.csv, to the local folder, that is home,
27:58
then HadoopMyFiles, and then student1.csv. So I am just copying it, and if I go
28:06
for the check you can find that the student1.csv file is existing
28:14
here. So in this way I have brought both sample1.txt and student1
28:18
.csv from HDFS to the local file system. So I have demonstrated the use of get and also the use of copyToLocal
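That copyToLocal step, sketched with the assumed names:
  # copy the HDFS file to the local folder, giving it a new name
  hadoop fs -copyToLocal /HadoopMyFiles/student_info.csv ~/HadoopMyFiles/student1.csv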
28:24
Now let me erase all the content; let me erase the content of this HadoopMyFiles folder, since I do not want to keep such content here, and delete this folder
28:38
So you see, this particular folder is going to get deleted. How do we do that one? In that case I shall go for the following:
28:46
let me type here hdfs dfs -rm -r, where -r is recursive, then the HDFS root, so we shall go for this
28:58
/HadoopMyFiles. So you can see that all the files and all the subfolders, whichever
29:05
are existing within this HadoopMyFiles folder, will get deleted, and the
29:11
HadoopMyFiles folder itself will get deleted as well. So I shall go for the enter. Now
29:19
let me check whether it is existing there or not; I am going for the refresh, and you
29:23
can find that the HadoopMyFiles folder has got deleted with all its content
29:28
So in this particular demonstration we have shown you how to use the
29:33
commands like ls, mkdir, put, copyFromLocal, get, copyToLocal, then rm, cat, mv and
29:42
cp. So we are remaining with the count. Okay, we are remaining with the count, so
29:46
how do we go for the count? We can go in this way: I shall go
29:50
for, say, hdfs dfs -count, and now we can give the
29:58
root also. You can see how many folders are there and how many files
30:03
are there; you are getting the count along with the byte count
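For illustration, counting the root might be issued as:
  # count directories, files and bytes under the HDFS root
  hdfs dfs -count /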
30:11
Okay, so these are the different folders there. So now let me make one folder, so hdfs
30:19
dfs, say -mkdir; I am making the folder again, so I shall go
30:28
for, say, /HadoopMyFiles; I am making this folder at first
30:39
Okay, to this folder I am copying two files as I did earlier. So hdfs dfs, then I shall go
30:46
for this -put, I shall go for this -put. From where? From our local system:
30:58
home, then HadoopMyFiles/, and what is the name? That is sample_file.txt,
31:11
to /HadoopMyFiles. So I am just making a copy of this. Okay, then I am putting another
31:18
file there, that is my student_info; whatever I did earlier I am just repeating, one second,
31:18
31:28
So let me go for the ls: hadoop fs -ls /HadoopMyFiles. We are
31:46
having this sample_file.txt, but for this particular student file I think the
31:52
spelling was wrong, that's why; so student_info and .csv, and yes, I think now it has got copied. So let me go for the file check: yes, the two files are
32:08
there, one is sample_file.txt and the other one is student_info
32:13
.csv. So let me check it from the browser also. Is the HadoopMyFiles folder there? If I go, yes,
32:19
it is there and it is having two files. Okay, now let us issue the command, that is
32:25
the count: hdfs dfs, then I shall go for -count /HadoopMyFiles. So it is showing that
32:44
it is having 294 bytes of information, and here we are having two files
32:51
and one folder; so in this way it is showing you the respective count
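The count as issued here would print output along these lines (the 294-byte figure is the one read out in the video; the column order is the standard DIR_COUNT, FILE_COUNT, CONTENT_SIZE, PATHNAME):
  hdfs dfs -count /HadoopMyFiles
  #   1   2   294   /HadoopMyFiles   (1 directory, 2 files, 294 bytes of content)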
32:56
So there is the respective count, and these are the two files existing under
33:03
this HadoopMyFiles folder. So in this way I have demonstrated all the
33:10
commands, that is ls, mkdir, put, copyFromLocal, and then get, copyToLocal, then
33:17
count, rm, cat, mv and cp. I think the concept is now getting clear to you, how
33:24
to issue such commands in our Hadoop HDFS. Thanks for watching this video