I have installed Pig 0.12 on my machine. When I run
darwin$ pig
grunt> ls /data/
hdfs://Nname:10001/data/pg20417.txt<r 3> 674570
hdfs://Nname:10001/data/pg4300.txt<r 3> 1573150
hdfs://Nname:10001/data/pg5000.txt<r 3> 1423803
hdfs://Nname:10001/data/weather <dir>
But when I try to run a query, I get the following error:
grunt> book = load '/data/pg4300.txt' as (lines:chararray);
2014-06-30 17:40:08,939 [main] ERROR org.apache.pig.tools.grunt.Grunt - ERROR 1000: Error during parsing. Encountered " <PATH> "book=load "" at line 2, column 1.
Was expecting one of:
<EOF>
"cat" ...
"clear" ...
"fs" ...
"sh" ...
"cd" ...
"cp" ...
"copyFromLocal" ...
"copyToLocal" ...
"dump" ...
"\\d" ...
"describe" ...
"\\de" ...
"aliases" ...
"explain" ...
"\\e" ...
"help" ...
"history" ...
"kill" ...
"ls" ...
"mv" ...
"mkdir" ...
"pwd" ...
"quit" ...
"\\q" ...
"register" ...
"rm" ...
"rmf" ...
"set" ...
"illustrate" ...
"\\i" ...
"run" ...
"exec" ...
"scriptDone" ...
"" ...
"" ...
<EOL> ...
";" ...
Details at logfile: /Users/Documents/pig_1404175088198.log
I tried changing load to LOAD and as to AS, but nothing worked.
Now load the data from the file student_data.txt into Pig by executing the following Pig Latin statement in the Grunt shell:

grunt> student = LOAD 'hdfs://localhost:9000/pig_data/student_data.txt'
       USING PigStorage(',')
       AS (id:int, firstname:chararray, lastname:chararray, phone:chararray, city:chararray);
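To check that the load worked, two standard Grunt commands are enough (the relation name student is taken from the statement above):

grunt> DESCRIBE student;
grunt> DUMP student;

DESCRIBE prints the schema declared in the LOAD statement, while DUMP actually executes the job and prints the tuples to the console.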
Apache Pig is a high-level dataflow platform for running MapReduce programs on Hadoop. Its language is Pig Latin. Pig scripts are internally converted to MapReduce jobs and executed on data stored in HDFS. Beyond MapReduce, Pig can also run its jobs on Apache Tez or Apache Spark.
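As a rough illustration of that dataflow style, here is a minimal word-count sketch in Pig Latin (the input path reuses the file from the question above; TOKENIZE and FLATTEN are Pig built-ins):

lines = LOAD '/data/pg4300.txt' AS (line:chararray);
words = FOREACH lines GENERATE FLATTEN(TOKENIZE(line)) AS word;
grouped = GROUP words BY word;
counts = FOREACH grouped GENERATE group, COUNT(words) AS n;
DUMP counts;

Each statement only defines a step in the flow; nothing executes until DUMP (or STORE) triggers the underlying jobs.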
I ran into the same issue and was looking for a solution. It turns out this happens if you do not put spaces around the equals sign:

book=load

gives the error, while

book = load

works. I am not sure whether this is expected behavior.
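For example, with the file from the question, the spaced form parses and runs:

grunt> book = load '/data/pg4300.txt' as (lines:chararray);
grunt> dump book;

whereas book=load '/data/pg4300.txt' as (lines:chararray); reproduces the ERROR 1000 parse error shown above.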