I basically want to replicate this command:
storm jar target/crawlIndexer-1.0-SNAPSHOT.jar
org.apache.storm.flux.Flux es-crawler.flux --local --sleep 30000
but as an executable class (similar to ESCrawlTopology), while still running it in local mode.
This is what I have tried so far:
public class ESCrawlTopology extends ConfigurableTopology {

    public static void main(String[] args) throws Exception {
        ConfigurableTopology.start(new ESCrawlTopology(),
                new String[] { "-conf", "crawler-conf.yaml", "-local" }); // added the -local flag
    }

    @Override
    protected int run(String[] args) {
        TopologyBuilder builder = new TopologyBuilder();

        int numWorkers = ConfUtils.getInt(getConf(), "topology.workers", 1);

        // set to the real number of shards ONLY if es.status.routing is set to
        // true in the configuration
        int numShards = 1;

        builder.setSpout("spout", new CollapsingSpout(), numShards);

        builder.setBolt("status_metrics", new StatusMetricsBolt())
                .shuffleGrouping("spout");

        builder.setBolt("partitioner", new URLPartitionerBolt(), numWorkers)
                .shuffleGrouping("spout");

        builder.setBolt("fetch", new FetcherBolt(), numWorkers)
                .fieldsGrouping("partitioner", new Fields("key"));

        builder.setBolt("sitemap", new SiteMapParserBolt(), numWorkers)
                .localOrShuffleGrouping("fetch");

        builder.setBolt("parse", new JSoupParserBolt(), numWorkers)
                .localOrShuffleGrouping("sitemap");

        builder.setBolt("indexer", new IndexerBolt(), numWorkers)
                .localOrShuffleGrouping("parse");

        Fields furl = new Fields("url");

        builder.setBolt("status", new StatusUpdaterBolt(), numWorkers)
                .fieldsGrouping("fetch", Constants.StatusStreamName, furl)
                .fieldsGrouping("sitemap", Constants.StatusStreamName, furl)
                .fieldsGrouping("parse", Constants.StatusStreamName, furl)
                .fieldsGrouping("indexer", Constants.StatusStreamName, furl);

        builder.setBolt("deleter", new DeletionBolt(), numWorkers)
                .localOrShuffleGrouping("status",
                        Constants.DELETION_STREAM_NAME);

        conf.registerMetricsConsumer(MetricsConsumer.class);
        conf.registerMetricsConsumer(LoggingMetricsConsumer.class);

        return submit("crawl", conf, builder);
    }
}
The main change I made was passing the "-local" flag as an argument from the main method.
This does appear to start Storm locally, but then I get an error from Elasticsearch:
org.elasticsearch.client.transport.NoNodeAvailableException: None of the configured nodes are available: []
at org.elasticsearch.client.transport.TransportClientNodesService.ensureNodesAreAvailable(TransportClientNodesService.java:344) ~[elasticsearch-5.3.0.jar:5.3.0]
at org.elasticsearch.client.transport.TransportClientNodesService.execute(TransportClientNodesService.java:242) ~[elasticsearch-5.3.0.jar:5.3.0]
at org.elasticsearch.client.transport.TransportProxyClient.execute(TransportProxyClient.java:59) ~[elasticsearch-5.3.0.jar:5.3.0]
at org.elasticsearch.client.transport.TransportClient.doExecute(TransportClient.java:366) ~[elasticsearch-5.3.0.jar:5.3.0]
at org.elasticsearch.client.support.AbstractClient.execute(AbstractClient.java:404) ~[elasticsearch-5.3.0.jar:5.3.0]
at org.elasticsearch.action.ActionRequestBuilder.execute(ActionRequestBuilder.java:80) ~[elasticsearch-5.3.0.jar:5.3.0]
at com.digitalpebble.stormcrawler.elasticsearch.persistence.CollapsingSpout.populateBuffer(CollapsingSpout.java:140) ~[storm-crawler-elasticsearch-1.5.jar:?]
at com.digitalpebble.stormcrawler.elasticsearch.persistence.AbstractSpout.nextTuple(AbstractSpout.java:322) ~[storm-crawler-elasticsearch-1.5.jar:?]
at org.apache.storm.daemon.executor$fn__4976$fn__4991$fn__5022.invoke(executor.clj:644) ~[storm-core-1.1.0.jar:1.1.0]
at org.apache.storm.util$async_loop$fn__557.invoke(util.clj:484) [storm-core-1.1.0.jar:1.1.0]
at clojure.lang.AFn.run(AFn.java:22) [clojure-1.7.0.jar:?]
at java.lang.Thread.run(Thread.java:748) [?:1.8.0_151]
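For reference, my crawler-conf.yaml points at the local Elasticsearch instance roughly like this (a hedged sketch with hypothetical values; the key names are my understanding of what storm-crawler-elasticsearch 1.5 reads, and the ES 5.x TransportClient connects to the transport port 9300, not the HTTP port 9200):

```yaml
# Hypothetical connection settings -- key names and values are assumptions,
# adjust to your storm-crawler-elasticsearch version and ES setup.
es.status.addresses: "localhost"         # TransportClient defaults to port 9300
es.status.cluster.name: "elasticsearch"  # must match cluster.name in elasticsearch.yml
es.status.index.name: "status"
es.status.doc.type: "status"
```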
Any ideas? Thanks.