I installed Hadoop on several laptops in order to form a Hadoop cluster. First we installed it in pseudo-distributed mode, and on all except one everything was perfect (i.e. all the services run, and when I do tests with `hadoop fs` it shows the HDFS). On the aforementioned laptop (the one with problems), the `hadoop fs -ls` command shows the contents of the local directory, not the HDFS; the same happens with the `-cat`, `-mkdir`, and `-put` commands. What could I be doing wrong?
Any help would be appreciated.
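For what it's worth, this is how I have been checking which filesystem the shell actually resolves (a sketch; forcing the `hdfs://` scheme works on any release, while `hdfs getconf` only exists from Hadoop 2.x on, so it may not apply to older installs):

# Force the hdfs:// scheme explicitly. If this lists HDFS contents
# while a plain `hadoop fs -ls /` lists local files, the client is
# falling back to the local filesystem as its default.
hadoop fs -ls hdfs://localhost:54310/

# On Hadoop 2.x and later, print the default filesystem the client
# resolved from its configuration; seeing file:/// here means
# core-site.xml is not being picked up.
hdfs getconf -confKey fs.defaultFS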
Here is my `core-site.xml`:
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<!-- Put site-specific property overrides in this file. -->

<configuration>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/home/hduser/hdfs_dir/tmp</value>
    <description></description>
  </property>

  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:54310</value>
    <description>.</description>
  </property>
</configuration>
I must say that this is the same file as on all the other laptops, and they work fine.
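Since the file itself is identical, I also want to rule out that this laptop loads a different configuration directory (a sketch; `HADOOP_CONF_DIR` is the standard override variable, and /usr/local/hadoop/conf is only an assumed install path, adjust to yours):

# Print the directory Hadoop takes its configuration from; if the
# variable is unset, the conf/ (or etc/hadoop/) directory of the
# installation is used instead.
echo $HADOOP_CONF_DIR

# Check that the core-site.xml in that directory is the edited one.
grep -A 1 fs.default.name "${HADOOP_CONF_DIR:-/usr/local/hadoop/conf}"/core-site.xml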
I had the same problem, and I had to make sure the value of `fs.default.name` included a trailing `/` to refer to the root path:
<property>
  <name>fs.default.name</name>
  <value>hdfs://localhost:54310/</value>
  <description>.</description>
</property>
Check that `fs.default.name` in `core-site.xml` points to the correct NameNode, for example:
<property>
  <name>fs.default.name</name>
  <value>hdfs://target-namenode:54310</value>
</property>
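It can also be worth confirming that a NameNode is actually listening at that address before blaming the config (a sketch; `target-namenode` is just the placeholder host from the example above, and `ss`/`telnet` are assumed to be available):

# On the NameNode machine: is anything listening on the RPC port?
ss -tlnp | grep 54310    # on older systems: netstat -tlnp | grep 54310

# From the problem laptop: can we reach that host and port at all?
telnet target-namenode 54310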