
I can't replicate between two CouchDB servers, so I want to dump from one server to a file and load from that file into the other server.

I dump with this command, and it works fine:

curl -X GET http://localhost:5984/<DATABASE_NAME>/_all_docs?include_docs=true > FILE.txt

But when I load with this command:

curl -d @FILE.txt -H “Content-Type: application/json” -X POST http://localhost:5984/<DATABASE_NAME>/_bulk_docs

it fails with:

curl: (6) Could not resolve host: application; Host not found {"error":"bad_content_type","reason":"Content-Type must be application/json"}

Any ideas?


6 Answers

Answered 2013-06-13T12:54:57.710

You can use the following command line to convert the output of the curl command into the "docs" structure that _bulk_docs requires:

curl -X GET 'http://localhost:5984/mydatabase/_all_docs?include_docs=true' | jq '{"docs": [.rows[].doc]}' | jq 'del(.docs[]._rev)' > db.json

jq is an excellent command-line JSON processor that is very useful in situations like this.

Hope it helps.
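To see the whole round trip, here is a minimal sketch of the dump-then-load flow (assuming jq is installed; the sample `_all_docs` response and database names are made up for illustration, and the two jq filters above are combined into one):

```shell
# In real use the input would come from:
#   curl -s 'http://localhost:5984/mydatabase/_all_docs?include_docs=true'
# Here a two-document sample response stands in for it.
cat > dump.json <<'EOF'
{"total_rows":2,"offset":0,"rows":[
 {"id":"a","key":"a","value":{"rev":"1-x"},"doc":{"_id":"a","_rev":"1-x","n":1}},
 {"id":"b","key":"b","value":{"rev":"1-y"},"doc":{"_id":"b","_rev":"1-y","n":2}}]}
EOF

# Wrap the docs in the {"docs": [...]} envelope that _bulk_docs expects,
# and strip _rev so the docs insert cleanly into a fresh target database.
jq -c '{docs: [.rows[].doc | del(._rev)]}' dump.json > db.json
cat db.json   # {"docs":[{"_id":"a","n":1},{"_id":"b","n":2}]}

# The result can then be posted to the target. Note the plain ASCII double
# quotes around the header -- the curly quotes in the question are what
# produced the "Could not resolve host: application" error:
#   curl -X POST -H "Content-Type: application/json" \
#     -d @db.json 'http://localhost:5984/<TARGET_DB>/_bulk_docs'
```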

Answered 2016-05-18T08:34:43.577

Answered 2012-07-25T04:38:09.440

As an alternative solution, you can use the couchdb-load and couchdb-dump utilities from the couchdb-python project.

Answered 2012-07-25T09:52:23.020

Nolan from the PouchDB team makes some great tools. They work well for dumping from and loading into CouchDB (including attachments):

Dump/Backup:

https://github.com/nolanlawson/pouchdb-dump-cli

Load/Restore:

https://github.com/nolanlawson/pouchdb-load

Answered 2016-06-23T07:31:29.810

There's also github.com/danielebailo/couchdb-dump , which can also help shed old revision data from a bloated database; the authors state:

We've seen 15GB database files, containing only 2.1GB of raw JSON, reduced to 2.5GB on disk after import!

There are also hints on how update_seq works if you want to sync clusters.

Answered 2022-02-06T13:11:08.690