I have successfully indexed a PDF with FSCrawler, but I cannot connect to FSCrawler's REST client to create a pipeline to elasticsearch. This is the command in my docker-compose file:
command: fscrawler fscrawler_rest
I can query elasticsearch against the index named after my FSCrawler job and retrieve results. When I then add the --rest flag to my docker-compose command, the REST client starts successfully (albeit with a warning I don't understand):
WARN [o.g.j.i.i.Providers] A provider fr.pilato.elasticsearch.crawler.fs.rest.UploadApi registered in SERVER runtime does not implement any provider interfaces applicable in the SERVER runtime.
Due to constraint configuration problems the provider fr.pilato.elasticsearch.crawler.fs.rest.UploadApi will be ignored.
INFO [f.p.e.c.f.r.RestServer] FS crawler Rest service started on [http://127.0.0.1:8080/fscrawler]
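As far as I can tell, that URL is just the documented default for the rest block in the job's _settings.yaml (I am inferring the exact value from the log line above and the FSCrawler docs, not from anything I set myself):

rest:
  url: "http://127.0.0.1:8080/fscrawler"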
Then, when I try to curl it, with or without the trailing slash:
curl -XGET "127.0.0.1:8080/fscrawler/"
I get:
curl: (7) Failed to connect to 127.0.0.1 port 8080: Connection refused
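That root call is only a reachability check; per the FSCrawler REST docs I would expect it to return a small JSON status document. The call I actually want to make afterwards is the upload endpoint, along the lines of (the file name here is just an example):

curl -F "file=@mydocument.pdf" "http://127.0.0.1:8080/fscrawler/_upload"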
The new docker-compose command, for reference:
command: fscrawler fscrawler_rest --loop 0 --rest debug
I can't debug this very easily, since docker-compose doesn't let me run CLI commands while the container is running, and I don't understand why the REST service is unreachable when I can still hit http://localhost:9200/fscrawler_rest without any problem.
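(For example, something like curl "http://localhost:9200/fscrawler_rest/_search?pretty" returns my indexed PDF just fine; the exact query here is only an illustration.)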
So FSCrawler itself is working with elasticsearch; it is only the REST service that doesn't seem to work. Has anyone successfully used the FSCrawler REST API?
EDIT: here is the full docker-compose.yml:
version: '3.6'
services:
  postgres:
    image: "postgres:12.1"
    env_file:
      - '.env'
    ports:
      - '127.0.0.1:5432:5432'
    restart: "${DOCKER_RESTART_POLICY:-unless-stopped}"
    stop_grace_period: "${DOCKER_STOP_GRACE_PERIOD:-3s}"
    volumes:
      - postgres:/var/lib/postgresql/data
    networks:
      - esnet

  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.8.0
    # build: ./es
    container_name: elasticsearch
    env_file:
      - ".env"
    depends_on:
      - "postgres"
    volumes:
      - esdata:/usr/share/elasticsearch/data
    environment:
      - node.name=elasticsearch
      - bootstrap.memory_lock=true
      - "ES_JAVA_OPTS=-Xms512m -Xmx512m"
      - discovery.type=single-node
      - network.host=0.0.0.0
      - network.publish_host=0.0.0.0
      - http.cors.enabled=true
      - http.cors.allow-origin=*
      - http.host=0.0.0.0
      - transport.host=0.0.0.0
    ulimits:
      memlock:
        soft: -1
        hard: -1
    ports:
      - 9200:9200
      - 9300:9300
    networks:
      - esnet

  fscrawler:
    # I have taken this docker image and updated to 2.7 snapshot: toto1310/fscrawler
    build:
      context: ${PWD}
      dockerfile: Dockerfile-toto
    container_name: fscrawler
    depends_on:
      - elasticsearch
    restart: always
    volumes:
      - ${PWD}/config:/root/.fscrawler
      - ${PWD}/data:/tmp/es
    networks:
      - esnet
    environment:
      - FS_URL=/tmp/es
      - ELASTICSEARCH_URL=http://elasticsearch:9200
      - ELASTICSEARCH_INDEX=fscrawler_rest
    command: fscrawler fscrawler_rest --loop 0 --rest debug

volumes:
  postgres:
  esdata:
    driver: local

networks:
  esnet:
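My working theory, which I have not been able to confirm, is that the REST service only binds to 127.0.0.1 inside the fscrawler container and that port 8080 is never published, so curl from the host can't reach it. The sketch below is the kind of change I have in mind; the port mapping and the 0.0.0.0 bind address are my own assumptions, not something I have verified:

  fscrawler:
    ports:
      - 8080:8080   # publish the REST port to the host (assumption)
    ...

and, in the job's _settings.yaml, pointing the REST service at a non-loopback address so it is reachable from outside the container:

rest:
  url: "http://0.0.0.0:8080/fscrawler"   # assuming the host part of rest.url controls the bind address

Is that the right direction, or is the problem somewhere else?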