I tried to write a file to my local HDFS setup using a Java program. I am using the Hadoop 2.3.0 distribution with the hadoop-client 2.3.0 and hadoop-hdfs 2.3.0 libraries.
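For context, the write attempt looks roughly like the following sketch (the fs.defaultFS URI, target path, and file content here are placeholders, not the exact values from my program):

import java.nio.charset.StandardCharsets;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsWriteExample {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Point the client at the NameNode RPC port (8020 by default),
        // not at the datanode data-transfer port (50010).
        conf.set("fs.defaultFS", "hdfs://localhost:8020");

        try (FileSystem fs = FileSystem.get(conf);
             FSDataOutputStream out = fs.create(new Path("/tmp/hello.txt"))) {
            out.write("hello hdfs".getBytes(StandardCharsets.UTF_8));
        }
    }
}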
The HDFS datanode log shows the following error:
2014-04-07 18:40:44,479 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: prabhathp:50010:DataXceiver error processing unknown operation src: /127.0.0.1:38572 dest: /127.0.0.1:50010
java.io.IOException: Version Mismatch (Expected: 28, Received: 26738 )
at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.readOp(Receiver.java:54)
at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:198)
at java.lang.Thread.run(Thread.java:744)
Can somebody explain this?
If the error Version Mismatch (Expected: 28, Received: 26738 ) is seen intermittently with a very high received version, the cause can be that an application that does not use the Hadoop RPC protocol has connected to the datanode port.
We see this error, for instance, when somebody accesses the datanode URL with a web browser (while intending to access the web interface).
A misconfiguration can have similar effects.
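To illustrate the mechanism (this is a self-contained sketch, not Hadoop's actual code): the datanode reads the first two bytes of an incoming connection as a big-endian short and treats them as the data transfer protocol version, so any foreign client, for example a browser sending an HTTP request, produces an arbitrary number instead of the expected 28.

import java.io.ByteArrayInputStream;
import java.io.DataInputStream;
import java.nio.charset.StandardCharsets;

public class VersionMismatchDemo {
    public static void main(String[] args) throws Exception {
        // Bytes a browser might send to the datanode data port instead of the
        // data transfer handshake (the exact payload here is an assumption).
        byte[] foreignBytes = "GET / HTTP/1.1\r\n".getBytes(StandardCharsets.US_ASCII);

        try (DataInputStream in = new DataInputStream(new ByteArrayInputStream(foreignBytes))) {
            // The first two bytes, read as a big-endian short, become the
            // "Received" version in the error message.
            short bogusVersion = in.readShort();
            System.out.println("Interpreted protocol version: " + bogusVersion); // 18245 for "GE"
        }
    }
}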