I want to use HPROF to profile my Hadoop job. The problem is that I get the TRACES section but no CPU SAMPLES section in the profile.out file. The code I use in my run method is:
/** Get configuration */
Configuration conf = getConf();
conf.set("textinputformat.record.delimiter","\n\n");
conf.setStrings("args", args);
/** JVM PROFILING */
conf.setBoolean("mapreduce.task.profile", true);
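/** HPROF agent options; Hadoop replaces %s with the task's profiling output file (profile.out) */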
conf.set("mapreduce.task.profile.params", "-agentlib:hprof=cpu=samples," +
"heap=sites,depth=6,force=n,thread=y,verbose=n,file=%s");
conf.set("mapreduce.task.profile.maps", "0-2");
conf.set("mapreduce.task.profile.reduces", "");
/** Job configuration */
Job job = new Job(conf, "HadoopSearch");
job.setJarByClass(Search.class);
job.setOutputKeyClass(Text.class);
job.setOutputValueClass(NullWritable.class);
/** Set Mapper and Reducer, use identity reducer*/
job.setMapperClass(Map.class);
job.setReducerClass(Reducer.class);
/** Set input and output formats */
job.setInputFormatClass(TextInputFormat.class);
job.setOutputFormatClass(TextOutputFormat.class);
/** Set input and output path */
FileInputFormat.addInputPath(job, new Path("/user/niko/16M"));
FileOutputFormat.setOutputPath(job, new Path(cmd.getOptionValue("output")));
job.waitForCompletion(true);
return 0;
How can I get the CPU SAMPLES to be written to the output as well?
I also get a strange error message on stderr, but I think it is unrelated, because it is also present when profiling is set to false or when the profiling code is commented out. The error is:
log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.impl.MetricsSystemImpl).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
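(As far as I understand, that warning just means log4j finds no appender configured on the classpath of the process printing it; a minimal log4j.properties along these lines, where the layout pattern is only an illustration, would normally silence it, but that is not what I am asking about.)

log4j.rootLogger=INFO, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{ISO8601} %-5p %c{1} - %m%n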