I am getting a "Not a valid JAR" error after making changes to StubDriver.java in simplewordcount.
I made the following changes in my repository:
bigdata/hdpexamples/java/src/com/cloudxlab/simplewordcount/StubDriver.java
FileInputFormat.addInputPath(job, new Path(args[1]));
//FileOutputFormat.setOutputPath(job, new Path(args[2])); // amit comment
FileOutputFormat.setOutputPath(job, new Path("javaout_demo"));
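With the output path hard-coded, the driver no longer takes it from the command line. Note that a relative HDFS path such as "javaout_demo" is resolved against the user's HDFS home directory, /user/&lt;username&gt; (the later error message shows the same pattern for "javamrout"). A minimal sketch of that mapping, with the username assumed from the shell prompt:

```shell
# Hypothetical illustration: HDFS resolves a relative output path
# against /user/<username>, so the hard-coded "javaout_demo" lands
# at the absolute path printed below.
user=amit6539398                 # assumed username from the prompt
rel=javaout_demo
abs="/user/$user/$rel"
echo "$abs"                      # /user/amit6539398/javaout_demo
```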
[amit6539398@ip-172-31-38-183 bigdata]$ git pull
remote: Counting objects: 9, done.
remote: Compressing objects: 100% (7/7), done.
remote: Total 9 (delta 5), reused 0 (delta 0), pack-reused 0
Unpacking objects: 100% (9/9), done.
From https://github.com/amit653/bigdata
f6a673b..5f60e44 master -> origin/master
Updating f6a673b..5f60e44
Fast-forward
hdpexamples/java/src/com/cloudxlab/simplewordcount/StubDriver.java | 4 ++--
1 file changed, 2 insertions(+), 2 deletions(-)
[amit6539398@ip-172-31-38-183 bigdata]$ pwd
/home/amit6539398/bigdata
[amit6539398@ip-172-31-38-183 bigdata]$ cd hdpexamples/java
[amit6539398@ip-172-31-38-183 java]$ ls
build build.xml lib README.md src
[amit6539398@ip-172-31-38-183 java]$ ant jar
Buildfile: /home/amit6539398/bigdata/hdpexamples/java/build.xml
compile:
[javac] /home/amit6539398/bigdata/hdpexamples/java/build.xml:14: warning: 'includeantruntime' was not set, defaulting to build.sysclasspath=last; set to false for repeatable builds
[javac] Compiling 1 source file to /home/amit6539398/bigdata/hdpexamples/java/build/classes
[javac] Note: /home/amit6539398/bigdata/hdpexamples/java/src/com/cloudxlab/simplewordcount/StubDriver.java uses or overrides a deprecated API.
[javac] Note: Recompile with -Xlint:deprecation for details.
jar:
[jar] Building jar: /home/amit6539398/bigdata/hdpexamples/java/build/jar/hdpexamples.jar
BUILD SUCCESSFUL
Total time: 1 second
+++ Here I am not sure why I got the "output directory already exists" error; in the end I deleted the directory.
[amit6539398@ip-172-31-38-183 java]$ hadoop jar build/jar/hdpexamples.jar com.cloudxlab.simplewordcount.StubDriver
WARNING: Use "yarn jar" to launch YARN applications.
18/01/18 16:06:44 INFO impl.TimelineClientImpl: Timeline service address: http://ip-172-31-13-154.ec2.internal:8188/ws/v1/timeline/
18/01/18 16:06:44 INFO client.RMProxy: Connecting to ResourceManager at ip-172-31-53-48.ec2.internal/172.31.53.48:8050
Exception in thread "main" org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory hdfs://ip-172-31-53-48.ec2.internal:8020/user/amit6539398/javamrout already exists
at org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.checkOutputSpecs(FileOutputFormat.java:146)
at org.apache.hadoop.mapreduce.JobSubmitter.checkSpecs(JobSubmitter.java:266)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:139)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1290)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1287)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1287)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1308)
at com.cloudxlab.wordcount.StubDriver.main(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
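The stack trace above is Hadoop's FileOutputFormat refusing to write into an output directory that already exists; MapReduce never overwrites output, so the directory has to be removed before re-running (on the cluster that is `hadoop fs -rm -r javamrout`, as done further down). A local-filesystem sketch of the same pre-run cleanup pattern, with the path a hypothetical stand-in:

```shell
# Local-filesystem analogy of the pre-run cleanup; on HDFS the
# equivalent command is: hadoop fs -rm -r javamrout
out=/tmp/javamrout_demo          # hypothetical stand-in for the HDFS dir
mkdir -p "$out"                  # pretend a previous run left it behind
rm -rf "$out"                    # remove it so the next run can create it
[ ! -d "$out" ] && echo "output dir clear"
```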
[amit6539398@ip-172-31-38-183 java]$ ls
build build.xml lib README.md src
[amit6539398@ip-172-31-38-183 com]$ cd cloudxlab/
[amit6539398@ip-172-31-38-183 cloudxlab]$ ls
chaining charcount customreader hiveudf nextword simplewordcount wordcount
[amit6539398@ip-172-31-38-183 cloudxlab]$ cd simplewordcount/
[amit6539398@ip-172-31-38-183 simplewordcount]$ ls
StubDriver.java StubMapper.java StubReducer.java StubTest.java
[amit6539398@ip-172-31-38-183 simplewordcount]$ ls -lrt
total 16
-rwxrwxr-x 1 amit6539398 amit6539398 3064 Jan 16 04:30 StubTest.java
-rwxrwxr-x 1 amit6539398 amit6539398 630 Jan 16 04:30 StubReducer.java
-rwxrwxr-x 1 amit6539398 amit6539398 685 Jan 16 04:30 StubMapper.java
-rwxrwxr-x 1 amit6539398 amit6539398 1585 Jan 18 15:59 StubDriver.java >>>>>>>> my changes are reflected here (timestamp updated)
[amit6539398@ip-172-31-38-183 simplewordcount]$ hadoop fs -rm -r javamrout +++++++ this step was not actually needed, since the code now writes to a different output directory
18/01/18 16:09:29 INFO fs.TrashPolicyDefault: Namenode trash configuration: Deletion interval = 360 minutes, Emptier interval = 0 minutes.
Moved: 'hdfs://ip-172-31-53-48.ec2.internal:8020/user/amit6539398/javamrout' to trash at: hdfs://ip-172-31-53-48.ec2.internal:8020/user/amit6539398/.Trash/Current
[amit6539398@ip-172-31-38-183 hdpexamples]$ pwd
/home/amit6539398/bigdata/hdpexamples
[amit6539398@ip-172-31-38-183 hdpexamples]$ cd ..
[amit6539398@ip-172-31-38-183 bigdata]$ hadoop jar build/jar/hdpexamples.jar com.cloudxlab.simplewordcount.StubDriver
WARNING: Use "yarn jar" to launch YARN applications.
Not a valid JAR: /home/amit6539398/bigdata/build/jar/hdpexamples.jar >>>>>>>>>>>> now I am getting this error.
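One observation, based only on the transcript above: the last command was run from /home/amit6539398/bigdata, but the ant output shows the jar was built under hdpexamples/java, so the relative path build/jar/hdpexamples.jar now resolves to a file that does not exist. A self-contained sketch of the mismatch, using hypothetical /tmp stand-in directories:

```shell
# Recreate the two working directories involved (hypothetical /tmp copies)
mkdir -p /tmp/demo/bigdata/hdpexamples/java/build/jar
touch /tmp/demo/bigdata/hdpexamples/java/build/jar/hdpexamples.jar

# Relative path resolves from hdpexamples/java (where ant built the jar)...
cd /tmp/demo/bigdata/hdpexamples/java
[ -f build/jar/hdpexamples.jar ] && echo "found from hdpexamples/java"

# ...but not from the bigdata directory where the last command was run
cd /tmp/demo/bigdata
[ -f build/jar/hdpexamples.jar ] || echo "missing from bigdata"
```

So either `cd hdpexamples/java` before running `hadoop jar build/jar/hdpexamples.jar ...`, or pass the absolute jar path (/home/amit6539398/bigdata/hdpexamples/java/build/jar/hdpexamples.jar).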