
I am trying to set up a Solr instance with multiple cores. My current problem is running the initial full import. The first core imports perfectly. However, when I try to import the second core, I get the following error:

org.apache.lucene.store.LockObtainFailedException: Lock obtain timed out: NativeFSLock@/home/solr/solr/data/index/write.lock
        at org.apache.lucene.store.Lock.obtain(Lock.java:84)
        at org.apache.lucene.index.IndexWriter.<init>(IndexWriter.java:602)
        at org.apache.solr.update.SolrIndexWriter.<init>(SolrIndexWriter.java:75)
        at org.apache.solr.update.SolrIndexWriter.create(SolrIndexWriter.java:62)
        at org.apache.solr.update.DefaultSolrCoreState.createMainIndexWriter(DefaultSolrCoreState.java:191)
        at org.apache.solr.update.DefaultSolrCoreState.getIndexWriter(DefaultSolrCoreState.java:77)
        at org.apache.solr.update.DirectUpdateHandler2.addDoc(DirectUpdateHandler2.java:157)
        at org.apache.solr.update.processor.RunUpdateProcessor.processAdd(RunUpdateProcessorFactory.java:61)
        at org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:51)
        at org.apache.solr.update.processor.DistributedUpdateProcessor.doLocalAdd(DistributedUpdateProcessor.java:432)
        at org.apache.solr.update.processor.DistributedUpdateProcessor.versionAdd(DistributedUpdateProcessor.java:557)
        at org.apache.solr.update.processor.DistributedUpdateProcessor.processAdd(DistributedUpdateProcessor.java:325)
        at org.apache.solr.update.processor.LogUpdateProcessor.processAdd(LogUpdateProcessorFactory.java:100)
        at org.apache.solr.handler.dataimport.SolrWriter.upload(SolrWriter.java:70)
        at org.apache.solr.handler.dataimport.DataImportHandler$1.upload(DataImportHandler.java:233)
        at org.apache.solr.handler.dataimport.DocBuilder.buildDocument(DocBuilder.java:507)
        at org.apache.solr.handler.dataimport.DocBuilder.buildDocument(DocBuilder.java:411)
        at org.apache.solr.handler.dataimport.DocBuilder.doFullDump(DocBuilder.java:326)
        at org.apache.solr.handler.dataimport.DocBuilder.execute(DocBuilder.java:234)
        at org.apache.solr.handler.dataimport.DataImporter.doFullImport(DataImporter.java:382)
        at org.apache.solr.handler.dataimport.DataImporter.runCmd(DataImporter.java:448)
        at org.apache.solr.handler.dataimport.DataImporter$1.run(DataImporter.java:429)
Feb 25, 2013 11:54:30 AM org.apache.solr.handler.dataimport.DocBuilder execute

This happens on whichever core I import second, regardless of which core I start with. What am I missing here? Each core works perfectly on its own.
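
For context, the full imports are run per core through each core's DataImportHandler endpoint, along these lines (host, port, and core names here are placeholders):

  http://localhost:8983/solr/core0/dataimport?command=full-import
  http://localhost:8983/solr/core1/dataimport?command=full-import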


1 Answer


This is happening because both cores point at the same index directory. As soon as a full import starts on one core, a write.lock file is created in that directory and is not released until the import completes, so the second core's IndexWriter cannot obtain the lock. You will want to keep each core's index in its own directory anyway, so modify each core's solrconfig.xml like this:

  <!-- Used to specify an alternate directory to hold all index data
       other than the default ./data under the Solr home.
       If replication is in use, this should match the replication 
       configuration. -->
  <dataDir>./data/${solr.core.name}</dataDir>
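
Alternatively, if the cores are defined in a legacy solr.xml, a per-core dataDir attribute achieves the same separation. This is only a sketch; the core names and paths are placeholders:

  <cores adminPath="/admin/cores">
    <!-- Give each core its own data directory so their write.lock files never collide -->
    <core name="core0" instanceDir="core0" dataDir="/home/solr/solr/core0/data" />
    <core name="core1" instanceDir="core1" dataDir="/home/solr/solr/core1/data" />
  </cores>

Either way, each core ends up with its own index directory and its own write.lock, so full imports on different cores no longer conflict.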