File Data Load

  1. Files from a Linux machine can be easily copied into an HDFS cluster using the hdfs dfs -put command. This command is part of the Hadoop client, which can be installed on any Linux machine. In our case, the Hadoop client is available as part of the Hadoop pseudo-distributed setup.

The general syntax of this command is:

hdfs dfs -put /local/path/test.file hdfs://namenode:9000/user/stage
  2. For this example, let us create a raw data area in HDFS (a folder in HDFS). This area will contain the data in its most natural form, as acquired from the source. Create it using the command:
hdfs dfs -mkdir -p /<any-path>/raw/txt

Once the previous command is executed, it will create the folder structure (<any-path>/raw/txt) in HDFS which can be viewed using the NameNode UI.
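Besides the NameNode UI, the new folder structure can also be verified directly from the command line. A minimal sketch, assuming the pseudo-distributed cluster is running and that /data stands in for <any-path> (substitute your own path):

```shell
# List the raw area recursively to confirm the folder structure exists
# (assumes a running HDFS; /data stands in for <any-path>)
hdfs dfs -ls -R /data/raw
```

If the mkdir step succeeded, this listing should show the txt subdirectory under /data/raw.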

  3. Now change into the directory where the generated file of contacts exists and run the following command:
hdfs dfs -put contacts.log hdfs://<hadoop-namenode-ip-address>:9000/<any-path>/raw/txt/contact.log
  4. The content of this file can now be viewed in Hue, as shown in the following screenshot:
Figure 18: Text file loaded in HDFS via dfs put
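If Hue is not available, the uploaded file can also be inspected from the command line. A sketch, assuming the put step above succeeded and that /data stands in for <any-path>:

```shell
# Print the first few lines of the uploaded file
# (assumes a running HDFS; /data stands in for <any-path>)
hdfs dfs -cat /data/raw/txt/contact.log | head -n 5

# Check the file's size, permissions, and replication without reading it
hdfs dfs -ls /data/raw/txt/contact.log
```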