I am trying to run a simple MapReduce example in Hadoop. This is my main program (MapClass is my own Mapper), together with the imports it uses:
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

Configuration configuration = new Configuration();
Job job = new Job(configuration, "conf");
job.setMapperClass(MapClass.class);
int numReducers = 1;
job.setNumReduceTasks(numReducers);
job.setOutputKeyClass(Text.class);
job.setOutputValueClass(Text.class);
// Input and output paths on HDFS
FileInputFormat.addInputPath(job, new Path("/user/www/input"));
FileOutputFormat.setOutputPath(job, new Path("/user/www/output/"));
System.exit(job.waitForCompletion(true) ? 0 : 1);
I am using the hadoop-2.0.0 and guava 14.0.1 libraries, and the program fails with this exception:
Exception in thread "main" java.lang.IncompatibleClassChangeError: class com.google.common.cache.CacheBuilder$3 has interface com.google.common.base.Ticker as super class
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:791)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:423)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:356)
at com.google.common.cache.CacheBuilder.<clinit>(CacheBuilder.java:190)
at org.apache.hadoop.hdfs.DomainSocketFactory.<init>(DomainSocketFactory.java:46)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:456)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:410)
at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:128)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2308)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:87)
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2342)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2324)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:351)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:163)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:335)
at org.apache.hadoop.fs.Path.getFileSystem(Path.java:194)
at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.addInputPath(FileInputFormat.java:368)
It looks like a library version mismatch problem. How can I resolve it?
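To see which Guava jar is actually being picked up at runtime, I also ran a small standalone check like the one below (GuavaVersionCheck is just a throwaway class name I made up for this test, not part of my job). It only prints where CacheBuilder is loaded from; referencing the class literal does not run its static initializer, so it works even though the job itself fails:

import com.google.common.cache.CacheBuilder;

public class GuavaVersionCheck {
    public static void main(String[] args) {
        // Print the jar that CacheBuilder is loaded from, to tell whether
        // my guava 14.0.1 jar or the Guava copy that Hadoop pulls in wins
        // on the classpath. getCodeSource() can be null only for classes
        // on the boot classpath, which does not apply to Guava here.
        System.out.println(
                CacheBuilder.class.getProtectionDomain().getCodeSource().getLocation());
    }
}

If the printed path points at a different Guava jar than the one I added, that would confirm two Guava copies are on the classpath at once.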