
I queue my submissions through the async package and invoke my chaincode via the node-sdk. However, I frequently get error: [EventService]: EventService[org1-peer1] timed out after:3000. There is no further information in the peer's logs, and the transactions are in fact being processed.

This is the client log (GRPC_VERBOSITY=DEBUG, GRPC_TRACE=all):

--> Starting task
2020-08-13T08:49:36.081Z | call_stream | [2] Received server headers:
                :status: 200
                content-type: application/grpc

2020-08-13T08:49:36.083Z | call_stream | [2] receive HTTP/2 data frame of length 1476
2020-08-13T08:49:36.083Z | call_stream | [2] parsed message of length 1476
2020-08-13T08:49:36.084Z | call_stream | [2] filterReceivedMessage of length 1476
2020-08-13T08:49:36.084Z | call_stream | [2] pushing to reader message of length 1471
2020-08-13T08:49:36.088Z | call_stream | [2] Received server trailers:
                grpc-status: 0
                grpc-message:

2020-08-13T08:49:36.088Z | call_stream | [2] received status code 0 from server
2020-08-13T08:49:36.088Z | call_stream | [2] received status details string "" from server
2020-08-13T08:49:36.089Z | call_stream | [2] ended with status: code=0 details=""
2020-08-13T08:49:36.090Z | subchannel | 10.100.136.32:30171 callRefcount 3 -> 2
2020-08-13T08:49:36.090Z | call_stream | [2] HTTP/2 stream closed with code 0
2020-08-13T08:49:36.098Z | call_stream | [8] Received server headers:
                :status: 200
                content-type: application/grpc

2020-08-13T08:49:36.099Z | call_stream | [8] receive HTTP/2 data frame of length 1476
2020-08-13T08:49:36.099Z | call_stream | [8] parsed message of length 1476
2020-08-13T08:49:36.099Z | call_stream | [8] filterReceivedMessage of length 1476
2020-08-13T08:49:36.099Z | call_stream | [8] pushing to reader message of length 1471
2020-08-13T08:49:36.100Z | call_stream | [8] Received server trailers:
                grpc-status: 0
                grpc-message:

2020-08-13T08:49:36.100Z | call_stream | [8] received status code 0 from server
2020-08-13T08:49:36.100Z | call_stream | [8] received status details string "" from server
2020-08-13T08:49:36.100Z | call_stream | [8] ended with status: code=0 details=""
2020-08-13T08:49:36.101Z | subchannel | 10.100.136.32:30171 callRefcount 2 -> 1
2020-08-13T08:49:36.101Z | call_stream | [8] HTTP/2 stream closed with code 0
2020-08-13T08:49:36.101Z | call_stream | [5] Received server headers:
                :status: 200
                content-type: application/grpc

2020-08-13T08:49:36.101Z | call_stream | [5] receive HTTP/2 data frame of length 1476
2020-08-13T08:49:36.101Z | call_stream | [5] parsed message of length 1476
2020-08-13T08:49:36.101Z | call_stream | [5] filterReceivedMessage of length 1476
2020-08-13T08:49:36.102Z | call_stream | [5] pushing to reader message of length 1471
2020-08-13T08:49:36.102Z | call_stream | [5] Received server trailers:
                grpc-status: 0
                grpc-message:

2020-08-13T08:49:36.102Z | call_stream | [5] received status code 0 from server
2020-08-13T08:49:36.102Z | call_stream | [5] received status details string "" from server
2020-08-13T08:49:36.102Z | call_stream | [5] ended with status: code=0 details=""
2020-08-13T08:49:36.102Z | subchannel | 10.100.136.32:30171 callRefcount 1 -> 0
2020-08-13T08:49:36.102Z | call_stream | [5] HTTP/2 stream closed with code 0
2020-08-13T08:49:36.103Z | call_stream | [6] Received server headers:
                :status: 200
                content-type: application/grpc

2020-08-13T08:49:36.103Z | call_stream | [6] receive HTTP/2 data frame of length 1476
2020-08-13T08:49:36.103Z | call_stream | [6] parsed message of length 1476
2020-08-13T08:49:36.103Z | call_stream | [6] filterReceivedMessage of length 1476
2020-08-13T08:49:36.103Z | call_stream | [6] pushing to reader message of length 1471
2020-08-13T08:49:36.103Z | call_stream | [6] Received server trailers:
                grpc-status: 0
                grpc-message:

2020-08-13T08:49:36.103Z | call_stream | [6] received status code 0 from server
2020-08-13T08:49:36.103Z | call_stream | [6] received status details string "" from server
2020-08-13T08:49:36.103Z | call_stream | [6] ended with status: code=0 details=""
2020-08-13T08:49:36.104Z | subchannel | 10.100.136.32:30151 callRefcount 3 -> 2
2020-08-13T08:49:36.104Z | call_stream | [6] HTTP/2 stream closed with code 0
2020-08-13T08:49:36.104Z | call_stream | [0] Received server headers:
                :status: 200
                content-type: application/grpc

2020-08-13T08:49:36.104Z | call_stream | [0] receive HTTP/2 data frame of length 1477
2020-08-13T08:49:36.104Z | call_stream | [0] parsed message of length 1477
2020-08-13T08:49:36.104Z | call_stream | [0] filterReceivedMessage of length 1477
2020-08-13T08:49:36.104Z | call_stream | [0] pushing to reader message of length 1472
2020-08-13T08:49:36.104Z | call_stream | [0] Received server trailers:
                grpc-status: 0
                grpc-message:

2020-08-13T08:49:36.104Z | call_stream | [0] received status code 0 from server
2020-08-13T08:49:36.104Z | call_stream | [0] received status details string "" from server
2020-08-13T08:49:36.105Z | call_stream | [0] ended with status: code=0 details=""
2020-08-13T08:49:36.105Z | subchannel | 10.100.136.32:30151 callRefcount 2 -> 1
2020-08-13T08:49:36.105Z | call_stream | [0] HTTP/2 stream closed with code 0
2020-08-13T08:49:36.105Z | call_stream | [3] Received server headers:
                :status: 200
                content-type: application/grpc

2020-08-13T08:49:36.105Z | call_stream | [3] receive HTTP/2 data frame of length 1476
2020-08-13T08:49:36.105Z | call_stream | [3] parsed message of length 1476
2020-08-13T08:49:36.105Z | call_stream | [3] filterReceivedMessage of length 1476
2020-08-13T08:49:36.106Z | call_stream | [3] pushing to reader message of length 1471
2020-08-13T08:49:36.106Z | call_stream | [3] Received server trailers:
                grpc-status: 0
                grpc-message:

2020-08-13T08:49:36.106Z | call_stream | [3] received status code 0 from server
2020-08-13T08:49:36.106Z | call_stream | [3] received status details string "" from server
2020-08-13T08:49:36.106Z | call_stream | [3] ended with status: code=0 details=""
2020-08-13T08:49:36.106Z | subchannel | 10.100.136.32:30151 callRefcount 1 -> 0
2020-08-13T08:49:36.106Z | call_stream | [3] HTTP/2 stream closed with code 0
2020-08-13T08:49:36.107Z | call_stream | [7] Received server headers:
                :status: 200
                content-type: application/grpc

2020-08-13T08:49:36.107Z | call_stream | [7] receive HTTP/2 data frame of length 1477
2020-08-13T08:49:36.107Z | call_stream | [7] parsed message of length 1477
2020-08-13T08:49:36.107Z | call_stream | [7] filterReceivedMessage of length 1477
2020-08-13T08:49:36.107Z | call_stream | [7] pushing to reader message of length 1472
2020-08-13T08:49:36.107Z | call_stream | [7] Received server trailers:
                grpc-status: 0
                grpc-message:

2020-08-13T08:49:36.107Z | call_stream | [7] received status code 0 from server
2020-08-13T08:49:36.107Z | call_stream | [7] received status details string "" from server
2020-08-13T08:49:36.108Z | call_stream | [7] ended with status: code=0 details=""
2020-08-13T08:49:36.108Z | subchannel | 10.100.136.32:30161 callRefcount 3 -> 2
2020-08-13T08:49:36.108Z | call_stream | [7] HTTP/2 stream closed with code 0
2020-08-13T08:49:36.116Z | resolving_load_balancer | dns:worker2.example.com:30151 IDLE -> IDLE
2020-08-13T08:49:36.116Z | connectivity_state | dns:worker2.example.com:30151 IDLE -> IDLE
2020-08-13T08:49:36.116Z | dns_resolver | Resolver constructed for target dns:worker2.example.com:30151
2020-08-13T08:49:36.117Z | call_stream | [1] Received server headers:
                :status: 200
                content-type: application/grpc

2020-08-13T08:49:36.117Z | call_stream | [1] receive HTTP/2 data frame of length 1477
2020-08-13T08:49:36.117Z | call_stream | [1] parsed message of length 1477
2020-08-13T08:49:36.117Z | call_stream | [1] filterReceivedMessage of length 1477
2020-08-13T08:49:36.117Z | call_stream | [1] pushing to reader message of length 1472
2020-08-13T08:49:36.117Z | call_stream | [1] Received server trailers:
                grpc-status: 0
                grpc-message:

2020-08-13T08:49:36.117Z | call_stream | [1] received status code 0 from server
2020-08-13T08:49:36.118Z | call_stream | [1] received status details string "" from server
2020-08-13T08:49:36.118Z | call_stream | [1] ended with status: code=0 details=""
2020-08-13T08:49:36.118Z | subchannel | 10.100.136.32:30161 callRefcount 2 -> 1
2020-08-13T08:49:36.118Z | call_stream | [1] HTTP/2 stream closed with code 0
2020-08-13T08:49:36.122Z | dns_resolver | Resolution update requested for target dns:worker2.example.com:30151
2020-08-13T08:49:36.122Z | resolving_load_balancer | dns:worker2.example.com:30151 IDLE -> CONNECTING
2020-08-13T08:49:36.122Z | connectivity_state | dns:worker2.example.com:30151 IDLE -> CONNECTING
2020-08-13T08:49:36.122Z | resolving_load_balancer | dns:worker2.example.com:30151 CONNECTING -> CONNECTING
2020-08-13T08:49:36.122Z | connectivity_state | dns:worker2.example.com:30151 CONNECTING -> CONNECTING
2020-08-13T08:49:36.132Z | dns_resolver | Resolved addresses for target dns:worker2.example.com:30151: [10.100.136.32:30151]
2020-08-13T08:49:36.132Z | pick_first | IDLE -> IDLE
2020-08-13T08:49:36.132Z | resolving_load_balancer | dns:worker2.example.com:30151 CONNECTING -> IDLE
2020-08-13T08:49:36.132Z | connectivity_state | dns:worker2.example.com:30151 CONNECTING -> IDLE
2020-08-13T08:49:36.132Z | dns_resolver | Resolution update requested for target dns:worker2.example.com:30151
2020-08-13T08:49:36.132Z | resolving_load_balancer | dns:worker2.example.com:30151 IDLE -> CONNECTING
2020-08-13T08:49:36.132Z | connectivity_state | dns:worker2.example.com:30151 IDLE -> CONNECTING
2020-08-13T08:49:36.133Z | resolving_load_balancer | dns:worker2.example.com:30151 CONNECTING -> CONNECTING
2020-08-13T08:49:36.133Z | connectivity_state | dns:worker2.example.com:30151 CONNECTING -> CONNECTING
2020-08-13T08:49:36.133Z | pick_first | Connect to address list 10.100.136.32:30151
2020-08-13T08:49:36.134Z | subchannel | 10.100.136.32:30151 refcount 0 -> 1
2020-08-13T08:49:36.135Z | subchannel | 10.100.136.32:30151 refcount 1 -> 2
2020-08-13T08:49:36.135Z | pick_first | Start connecting to subchannel with address 10.100.136.32:30151
2020-08-13T08:49:36.135Z | pick_first | IDLE -> CONNECTING
2020-08-13T08:49:36.135Z | resolving_load_balancer | dns:worker2.example.com:30151 CONNECTING -> CONNECTING
2020-08-13T08:49:36.135Z | connectivity_state | dns:worker2.example.com:30151 CONNECTING -> CONNECTING
2020-08-13T08:49:36.135Z | subchannel | 10.100.136.32:30151 IDLE -> CONNECTING
2020-08-13T08:49:36.135Z | pick_first | CONNECTING -> CONNECTING
2020-08-13T08:49:36.135Z | resolving_load_balancer | dns:worker2.example.com:30151 CONNECTING -> CONNECTING
2020-08-13T08:49:36.135Z | connectivity_state | dns:worker2.example.com:30151 CONNECTING -> CONNECTING
2020-08-13T08:49:36.136Z | call_stream | [4] Received server headers:
                :status: 200
                content-type: application/grpc

2020-08-13T08:49:36.136Z | call_stream | [4] receive HTTP/2 data frame of length 1477
2020-08-13T08:49:36.136Z | call_stream | [4] parsed message of length 1477
2020-08-13T08:49:36.137Z | call_stream | [4] filterReceivedMessage of length 1477
2020-08-13T08:49:36.137Z | call_stream | [4] pushing to reader message of length 1472
2020-08-13T08:49:36.137Z | call_stream | [4] Received server trailers:
                grpc-status: 0
                grpc-message:

2020-08-13T08:49:36.137Z | call_stream | [4] received status code 0 from server
2020-08-13T08:49:36.137Z | call_stream | [4] received status details string "" from server
2020-08-13T08:49:36.137Z | call_stream | [4] ended with status: code=0 details=""
2020-08-13T08:49:36.137Z | subchannel | 10.100.136.32:30161 callRefcount 1 -> 0
2020-08-13T08:49:36.137Z | call_stream | [4] HTTP/2 stream closed with code 0
2020-08-13T08:49:36.142Z | dns_resolver | Resolved addresses for target dns:worker2.example.com:30151: [10.100.136.32:30151]
2020-08-13T08:49:36.142Z | subchannel | 10.100.136.32:30151 refcount 2 -> 1
2020-08-13T08:49:36.142Z | pick_first | Connect to address list 10.100.136.32:30151
2020-08-13T08:49:36.142Z | subchannel | 10.100.136.32:30151 refcount 1 -> 2
2020-08-13T08:49:36.142Z | pick_first | CONNECTING -> CONNECTING
2020-08-13T08:49:36.143Z | resolving_load_balancer | dns:worker2.example.com:30151 CONNECTING -> CONNECTING
2020-08-13T08:49:36.143Z | connectivity_state | dns:worker2.example.com:30151 CONNECTING -> CONNECTING
2020-08-13T08:49:36.237Z | subchannel | 10.100.136.32:30151 CONNECTING -> READY
2020-08-13T08:49:36.237Z | pick_first | Pick subchannel with address 10.100.136.32:30151
2020-08-13T08:49:36.237Z | pick_first | CONNECTING -> READY
2020-08-13T08:49:36.237Z | resolving_load_balancer | dns:worker2.example.com:30151 CONNECTING -> READY
2020-08-13T08:49:36.237Z | connectivity_state | dns:worker2.example.com:30151 CONNECTING -> READY
2020-08-13T08:49:36.237Z | subchannel | 10.100.136.32:30151 refcount 2 -> 3
2020-08-13T08:49:36.237Z | subchannel | 10.100.136.32:30151 refcount 3 -> 2
2020-08-13T08:49:36.239Z | channel | dns:worker2.example.com:30151 createCall [9] method="/protos.Deliver/DeliverFiltered", deadline=Infinity
2020-08-13T08:49:36.239Z | call_stream | [9] Sending metadata
2020-08-13T08:49:36.239Z | channel | Pick result: COMPLETE subchannel: 10.100.136.32:30151 status: undefined undefined
2020-08-13T08:49:36.240Z | call_stream | [9] write() called with message of length 1111
2020-08-13T08:49:36.241Z | channel | dns:worker2.example.com:30151 createCall [10] method="/protos.Deliver/DeliverFiltered", deadline=Infinity
2020-08-13T08:49:36.241Z | call_stream | [10] Sending metadata
2020-08-13T08:49:36.241Z | channel | Pick result: COMPLETE subchannel: 10.100.136.32:30151 status: undefined undefined
2020-08-13T08:49:36.241Z | call_stream | [10] write() called with message of length 1110
2020-08-13T08:49:36.241Z | call_stream | [9] deferring writing data chunk of length 1116
2020-08-13T08:49:36.241Z | call_stream | [10] deferring writing data chunk of length 1115
2020-08-13T08:49:36.241Z | subchannel | Starting stream with headers
                grpc-accept-encoding: identity,deflate,gzip
                accept-encoding: identity,gzip
                :authority: worker2.example.com
                user-agent: grpc-node-js/1.0.3
                content-type: application/grpc
                :method: POST
                :path: /protos.Deliver/DeliverFiltered
                te: trailers

2020-08-13T08:49:36.241Z | call_stream | [9] attachHttp2Stream from subchannel 10.100.136.32:30151
2020-08-13T08:49:36.241Z | subchannel | 10.100.136.32:30151 callRefcount 0 -> 1
2020-08-13T08:49:36.241Z | call_stream | [9] sending data chunk of length 1116 (deferred)
2020-08-13T08:49:36.242Z | subchannel | Starting stream with headers
                grpc-accept-encoding: identity,deflate,gzip
                accept-encoding: identity,gzip
                :authority: worker2.example.com
                user-agent: grpc-node-js/1.0.3
                content-type: application/grpc
                :method: POST
                :path: /protos.Deliver/DeliverFiltered
                te: trailers

2020-08-13T08:49:36.242Z | call_stream | [10] attachHttp2Stream from subchannel 10.100.136.32:30151
2020-08-13T08:49:36.242Z | subchannel | 10.100.136.32:30151 callRefcount 1 -> 2
2020-08-13T08:49:36.242Z | call_stream | [10] sending data chunk of length 1115 (deferred)
2020-08-13T08:49:36.242Z | channel | dns:worker2.example.com:30151 createCall [11] method="/protos.Deliver/DeliverFiltered", deadline=Infinity
2020-08-13T08:49:36.243Z | call_stream | [11] Sending metadata
2020-08-13T08:49:36.243Z | channel | Pick result: COMPLETE subchannel: 10.100.136.32:30151 status: undefined undefined
2020-08-13T08:49:36.243Z | call_stream | [11] write() called with message of length 1111
2020-08-13T08:49:36.243Z | call_stream | [11] deferring writing data chunk of length 1116
2020-08-13T08:49:36.243Z | subchannel | Starting stream with headers
                grpc-accept-encoding: identity,deflate,gzip
                accept-encoding: identity,gzip
                :authority: worker2.example.com
                user-agent: grpc-node-js/1.0.3
                content-type: application/grpc
                :method: POST
                :path: /protos.Deliver/DeliverFiltered
                te: trailers

2020-08-13T08:49:36.243Z | call_stream | [11] attachHttp2Stream from subchannel 10.100.136.32:30151
2020-08-13T08:49:36.243Z | subchannel | 10.100.136.32:30151 callRefcount 2 -> 3
2020-08-13T08:49:36.243Z | call_stream | [11] sending data chunk of length 1116 (deferred)
2020-08-13T08:49:36.273Z | call_stream | [9] Received server headers:
                :status: 200
                content-type: application/grpc

2020-08-13T08:49:36.273Z | call_stream | [9] receive HTTP/2 data frame of length 164
2020-08-13T08:49:36.274Z | call_stream | [9] parsed message of length 164
2020-08-13T08:49:36.274Z | call_stream | [9] filterReceivedMessage of length 164
2020-08-13T08:49:36.274Z | call_stream | [9] pushing to reader message of length 159
2020-08-13T08:49:36.279Z | call_stream | [11] Received server headers:
                :status: 200
                content-type: application/grpc

2020-08-13T08:49:36.279Z | call_stream | [11] receive HTTP/2 data frame of length 164
2020-08-13T08:49:36.279Z | call_stream | [11] parsed message of length 164
2020-08-13T08:49:36.279Z | call_stream | [11] filterReceivedMessage of length 164
2020-08-13T08:49:36.279Z | call_stream | [11] pushing to reader message of length 159
2020-08-13T08:49:36.286Z | channel | dns:worker2.example.com:30011 createCall [12] method="/orderer.AtomicBroadcast/Broadcast", deadline=Infinity
2020-08-13T08:49:36.286Z | call_stream | [12] Sending metadata
2020-08-13T08:49:36.286Z | channel | Pick result: COMPLETE subchannel: 10.100.136.32:30011 status: undefined undefined
2020-08-13T08:49:36.287Z | call_stream | [12] write() called with message of length 5483
2020-08-13T08:49:36.287Z | call_stream | [12] deferring writing data chunk of length 5488
2020-08-13T08:49:36.287Z | subchannel | Starting stream with headers
                grpc-accept-encoding: identity,deflate,gzip
                accept-encoding: identity,gzip
                :authority: worker2.example.com
                user-agent: grpc-node-js/1.0.3
                content-type: application/grpc
                :method: POST
                :path: /orderer.AtomicBroadcast/Broadcast
                te: trailers

2020-08-13T08:49:36.287Z | call_stream | [12] attachHttp2Stream from subchannel 10.100.136.32:30011
2020-08-13T08:49:36.287Z | subchannel | 10.100.136.32:30011 callRefcount 0 -> 1
2020-08-13T08:49:36.287Z | call_stream | [12] sending data chunk of length 5488 (deferred)
2020-08-13T08:49:36.287Z | call_stream | [10] Received server headers:
                :status: 200
                content-type: application/grpc

2020-08-13T08:49:36.287Z | call_stream | [10] receive HTTP/2 data frame of length 164
2020-08-13T08:49:36.287Z | call_stream | [10] parsed message of length 164
2020-08-13T08:49:36.287Z | call_stream | [10] filterReceivedMessage of length 164
2020-08-13T08:49:36.288Z | call_stream | [10] pushing to reader message of length 159
2020-08-13T08:49:36.315Z | call_stream | [12] Received server headers:
                :status: 200
                content-type: application/grpc

2020-08-13T08:49:36.315Z | call_stream | [12] receive HTTP/2 data frame of length 8
2020-08-13T08:49:36.316Z | call_stream | [12] parsed message of length 8
2020-08-13T08:49:36.316Z | call_stream | [12] filterReceivedMessage of length 8
2020-08-13T08:49:36.316Z | call_stream | [12] pushing to reader message of length 3
2020-08-13T08:49:36.319Z | call_stream | [12] end() called
2020-08-13T08:49:36.319Z | call_stream | [12] calling end() on HTTP/2 stream
2020-08-13T08:49:36.351Z | call_stream | [12] Received server trailers:
                grpc-status: 0
                grpc-message:

2020-08-13T08:49:36.351Z | call_stream | [12] received status code 0 from server
2020-08-13T08:49:36.352Z | call_stream | [12] received status details string "" from server
2020-08-13T08:49:36.352Z | call_stream | [12] ended with status: code=0 details=""
2020-08-13T08:49:36.352Z | subchannel | 10.100.136.32:30011 callRefcount 1 -> 0
2020-08-13T08:49:36.352Z | call_stream | [12] cancelWithStatus code: 1 details: "Cancelled on client"
2020-08-13T08:49:36.352Z | call_stream | [12] ended with status: code=1 details="Cancelled on client"
2020-08-13T08:49:36.352Z | call_stream | [12] HTTP/2 stream closed with code 0
2020-08-13T08:49:38.336Z | call_stream | [10] receive HTTP/2 data frame of length 91
2020-08-13T08:49:38.336Z | call_stream | [10] parsed message of length 91
2020-08-13T08:49:38.336Z | call_stream | [10] filterReceivedMessage of length 91
2020-08-13T08:49:38.336Z | call_stream | [10] pushing to reader message of length 86
2020-08-13T08:49:38.336Z | call_stream | [9] receive HTTP/2 data frame of length 91
2020-08-13T08:49:38.336Z | call_stream | [9] parsed message of length 91
2020-08-13T08:49:38.336Z | call_stream | [9] filterReceivedMessage of length 91
2020-08-13T08:49:38.336Z | call_stream | [9] pushing to reader message of length 86
2020-08-13T08:49:38.337Z | call_stream | [11] receive HTTP/2 data frame of length 91
2020-08-13T08:49:38.337Z | call_stream | [11] parsed message of length 91
2020-08-13T08:49:38.337Z | call_stream | [11] filterReceivedMessage of length 91
2020-08-13T08:49:38.337Z | call_stream | [11] pushing to reader message of length 86
2020-08-13T08:49:39.240Z - error: [EventService]: EventService[org1-peer1] timed out after:3000
2020-08-13T08:49:39.241Z - error: [EventService]: send[org1-peer1] - #1 - Starting stream to org1-peer1 failed

Client code:

'use strict';
const FabricCAServices = require('fabric-ca-client');
const { Wallets, Gateway } = require('fabric-network');
const fs = require('fs');
const path = require('path');
const yaml = require('js-yaml');
const async = require("async");

const user = 'benchmark';
const userpw = 'benchmarkPW';
const mspID = 'Org1MSP';
const BENCHMARK_CONCURRENCY = 1;
const BENCHMARK_OPERATIONS = 3;

const txQueue = async.queue(async (task) => {
  console.log('--> Starting task');
  await task;
}, BENCHMARK_CONCURRENCY);

const connectionPath = path.join(process.cwd(), 'gateway/connection.yaml');
const data = fs.readFileSync(connectionPath);
const ccp = yaml.safeLoad(data);

async function createWallet() {
  try {
    const walletPath = path.join(process.cwd(), 'identity/wallet');
    const wallet = await Wallets.newFileSystemWallet(walletPath);
    return wallet;
  } catch (error) {
    console.error(`Error: ${error}`);
  }
}

async function enrollUser(wallet) {
  try {
    // Create a new CA client for interacting with the CA.
    const caTLSCACerts = ccp.certificateAuthorities['org1-ca'].tlsCACerts.pem;
    const caUrl = ccp.certificateAuthorities['org1-ca'].url;
    const ca = new FabricCAServices(caUrl, { trustedRoots: caTLSCACerts, verify: false }, 'CA');

    // Check to see if we've already enrolled the user.
    const userExists = await wallet.get(user);
    if (userExists) {
      console.log(`An identity for the client user "${user}" already exists in the wallet`);
    } else {
      // Enroll signing material
      let enrollment = await ca.enroll({ enrollmentID: user, enrollmentSecret: userpw });
      let x509Identity = {
        credentials: {
          certificate: enrollment.certificate,
          privateKey: enrollment.key.toBytes(),
        },
        mspId: mspID,
        type: 'X.509',
      };
      await wallet.put(user, x509Identity);
      console.log(`Successfully enrolled msp for user "${user}" and imported it into the wallet`);

      const tlsca = new FabricCAServices(caUrl, { trustedRoots: caTLSCACerts, verify: false }, `TLSCA`);
      enrollment = await tlsca.enroll({ enrollmentID: user, enrollmentSecret: userpw, profile: 'tls' });
      x509Identity = {
        credentials: {
          certificate: enrollment.certificate,
          privateKey: enrollment.key.toBytes(),
        },
        mspId: mspID,
        type: 'X.509',
      };
      await wallet.put(`${user}-tls`, x509Identity);
      console.log(`Successfully enrolled tls-msp for user "${user}" and imported it into the wallet`);
    }
  } catch (error) {
    console.error(`Error enrolling user "${user}": ${error}`);
    process.exit(1);
  }
}

async function startBenchmark(wallet) {
  try {
    const gateway = new Gateway();

    let connectionOptions = {
      identity: `${user}`,
      clientTlsIdentity: `${user}-tls`,
      wallet: wallet,
      discovery: { enabled: false, asLocalhost: false },
    };

    await gateway.connect(ccp, connectionOptions);

    const network = await gateway.getNetwork('channel1');
    const contract = network.getContract('cc-abac');

    let startTime = 0;
    let left = BENCHMARK_OPERATIONS;

    startTime = Date.now();
    while (left > 0) {
      // Note: submitTransaction() starts the transaction flow immediately;
      // the queue only limits how many of the returned promises are awaited at once.
      txQueue.push(contract.submitTransaction('invoke', 'a', 'b', '1'));
      left--;
    }

    await txQueue.drain();

    let endTime = Date.now();
    let duration = (endTime - startTime) / 1000;
    let tps = BENCHMARK_OPERATIONS / duration;
    console.log('======== Benchmark finished ========');
    console.log(`Processed ${BENCHMARK_OPERATIONS} ops in ${duration} seconds. TPS: ${tps}`);

    gateway.disconnect();
    process.exit(0);
  } catch (error) {
    console.error(`Got error: ${error}`);
    process.exit(1);
  }
}

async function main() {
  try {
    const wallet = await createWallet();
    await enrollUser(wallet);
    await startBenchmark(wallet);
  } catch (error) {
    console.error(`Error: ${error}`);
    process.exit(1);
  }
}

main();

Connection profile:

name: "Network"
version: "1.1"

client:
  organization: Org1

  connection:
    timeout:
      peer:
        endorser: 120
        eventHub: 60
        eventReg: 30
      orderer: 30
    options:
      grpc.keepalive_timeout_ms: 10000

channels:
  channel1:
    orderers:
      - org1-orderer
      - org2-orderer
      - org3-orderer

    peers:
      - org1-peer1
      - org2-peer1
      - org3-peer1

organizations:
  Org1:
    mspid: Org1MSP
    peers:
      - org1-peer1
    certificateAuthorities:
      - org1-ca

  Org2:
    mspid: Org2MSP
    peers:
      - org2-peer1
    certificateAuthorities:
      - org2-ca

  Org3:
    mspid: Org3MSP
    peers:
      - org3-peer1
    certificateAuthorities:
      - org3-ca

certificateAuthorities:
  org1-ca:
    url: https://worker2.example.com:30051
    tlsCACerts:
      path: "./crypto/org1-tls-ca-ca-cert.pem"
    httpOptions:
      verify: false
    caName: CA

  org2-ca:
    url: https://worker2.example.com:30052
    tlsCACerts:
      path: "./crypto/org2-tls-ca-ca-cert.pem"
    httpOptions:
      verify: false
    caName: CA

  org3-ca:
    url: https://worker2.example.com:30053
    tlsCACerts:
      path: "./crypto/org3-tls-ca-ca-cert.pem"
    httpOptions:
      verify: false
    caName: CA

orderers:
  org1-orderer:
    url: grpcs://worker2.example.com:30011
    grpcOptions:
      ssl-target-name-override: worker2.example.com
    tlsCACerts:
      path: "./crypto/org1-tls-ca-ca-cert.pem"

  org2-orderer:
    url: grpcs://worker2.example.com:30012
    grpcOptions:
      ssl-target-name-override: worker2.example.com
    tlsCACerts:
      path: "./crypto/org2-tls-ca-ca-cert.pem"

  org3-orderer:
    url: grpcs://worker2.example.com:30013
    grpcOptions:
      ssl-target-name-override: worker2.example.com
    tlsCACerts:
      path: "./crypto/org3-tls-ca-ca-cert.pem"

peers:
  org1-peer1:
    url: grpcs://worker2.example.com:30151
    httpOptions:
      verify: false
    grpcOptions:
      ssl-target-name-override: worker2.example.com
    tlsCACerts:
      path: "./crypto/org1-tls-ca-ca-cert.pem"

  org2-peer1:
    url: grpcs://worker2.example.com:30161
    httpOptions:
      verify: false
    grpcOptions:
      ssl-target-name-override: worker2.example.com
    tlsCACerts:
      path: "./crypto/org2-tls-ca-ca-cert.pem"

  org3-peer1:
    url: grpcs://worker2.example.com:30171
    httpOptions:
      verify: false
    grpcOptions:
      ssl-target-name-override: worker2.example.com
    tlsCACerts:
      path: "./crypto/org3-tls-ca-ca-cert.pem"

Versions used:
HLF: 2.1
Node SDK: 2.2
Node: 10.22

Any ideas are greatly appreciated!


1 Answer


That error is generated when the event service has connected to receive block events from a peer and no events arrive within the timeout window. It is possible that this one peer is unresponsive while transactions still flow, because there are enough contactable peers for endorsement to succeed. If the logs confirm that all of your peers are communicating correctly and your transactions all execute fine, the error is probably spurious and you can ignore it.

That said, even if the error log is spurious, it would be nice if it were not there. It may be that an unsuccessful reconnect is triggered by the order of operations in the Gateway.disconnect() flow, but I am only guessing here. It might be worth raising a bug against the Node SDK in Jira for further investigation:

https://jira.hyperledger.org/projects/FABN
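
In the meantime, if the message turns out to be harmless noise, one thing you could experiment with is the gateway's event handling options at connect time. The sketch below is a variant of the connectionOptions from your own code, untested against your network; the option values are assumptions to try, not a confirmed fix. eventHandlerOptions and DefaultEventHandlerStrategies are part of the fabric-network 2.x API:

const { Gateway, DefaultEventHandlerStrategies } = require('fabric-network');

// Variant of the question's connectionOptions; `user`, `wallet` and `ccp`
// are the same variables as in the question's code. The eventHandlerOptions
// values are assumptions to experiment with, not a confirmed fix.
const connectionOptions = {
  identity: `${user}`,
  clientTlsIdentity: `${user}-tls`,
  wallet: wallet,
  discovery: { enabled: false, asLocalhost: false },
  eventHandlerOptions: {
    // Only wait for a commit event from any single peer in your own MSP,
    // so a slow or unreachable peer in another org cannot stall commits.
    strategy: DefaultEventHandlerStrategies.MSPID_SCOPE_ANYFORTX,
    // Seconds to wait for commit events before giving up (default 300).
    commitTimeout: 300,
  },
};

await gateway.connect(ccp, connectionOptions);

If the timeout only ever fires for peers outside your organization, narrowing the commit strategy this way should at least stop it affecting your benchmark runs.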

answered 2020-08-13T11:57:19.110