
I am building a PUB/SUB system backed by Redis.

I have one publisher and tons of subscribers. The subscribers are not that reliable: they can lose their connection at any time and need to be able to "recover" from a lost connection.

There is a twist, though: I would like the backlog capped at some number N, meaning that a faulty subscriber should only be able to recover up to the last N messages.

The trivial solution is (sketched in code after the list):

  1. The publisher publishes message X.
  2. X is pushed onto a list: RPUSH list message
  3. The message is encoded to include its index in the list.
  4. The encoded message is published to consumers (with the index embedded): PUBLISH channel encoded
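For concreteness, here is a minimal sketch of that publisher flow using the redis-py client; the key name "backlog", the channel name "updates", and the JSON envelope are illustrative assumptions, not part of the question.

import json
import redis

r = redis.Redis()

def publish_trivial(message: str) -> None:
    # RPUSH returns the new length of the list, which doubles as the
    # 1-based index of the message that was just appended.
    index = r.rpush("backlog", message)
    # Embed the index so a subscriber always knows its position in the stream.
    encoded = json.dumps({"index": index, "message": message})
    r.publish("updates", encoded)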

If a consumer needs to re-establish its connection:

  1. It asks Redis for all the values in the list after the index it already has, and atomically executes a PSUBSCRIBE (see the sketch below this list).
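Redis has no single command that runs LRANGE and PSUBSCRIBE atomically, so the minimal sketch below (redis-py again, same assumed key and channel names, and SUBSCRIBE on a single channel rather than PSUBSCRIBE) subscribes first and then reads the backlog, using the embedded index to skip anything that arrives twice.

import json
import redis

r = redis.Redis()

def recover(last_index: int) -> None:
    pubsub = r.pubsub()
    # Subscribe before reading the backlog so no message can fall into the
    # gap between the two steps; duplicates are filtered by index instead.
    pubsub.subscribe("updates")
    for raw in r.lrange("backlog", last_index, -1):
        last_index += 1
        handle(raw)                       # replay everything that was missed
    for item in pubsub.listen():
        if item["type"] != "message":
            continue
        payload = json.loads(item["data"])
        if payload["index"] <= last_index:
            continue                      # already replayed from the backlog
        last_index = payload["index"]
        handle(payload["message"])

def handle(message) -> None:
    # Hypothetical message handler; replace with real processing.
    print(message)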

Up to here we are all good.

My big question is: what if I want the backlog list to be capped at N items?

Is there any way I can keep an ever-increasing index AND a capped backlog in the list?


1 Answer


How about this? To publish a message, execute:

LPUSH list message
LTRIM list 0 N
INCR global_index
PUBLISH channel global_index
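Translated into redis-py, that publishing sequence might look like the sketch below; the cap N, the key names, and the channel come from the commands above, while everything else (client setup, function name) is an assumption.

import redis

r = redis.Redis()
N = 1000  # backlog cap

def publish(message: str) -> None:
    r.lpush("list", message)                # newest message goes to the head of the list
    r.ltrim("list", 0, N)                   # keep indices 0..N, dropping older messages
    global_index = r.incr("global_index")   # ever-increasing message counter
    r.publish("channel", global_index)      # tell subscribers the counter moved

If the publisher can die between these calls, the first three commands could also be wrapped in a MULTI/EXEC pipeline or a small Lua script so the list and the counter never drift apart; that is an optional hardening step, not something the answer requires.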

When it receives a message via pub/sub, and when it starts up, a client needs to compare its latest index (which it can keep in Redis or anywhere else) against global_index and read min(global_index - my_index, N) messages from the list to catch up (basically LRANGE list 0 (global_index - my_index)).
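A sketch of that catch-up step, again with redis-py, assuming the client tracks my_index (the highest global_index it has processed) on its own; handle() is a hypothetical stand-in for real message processing.

import redis

r = redis.Redis()
N = 1000  # must match the publisher's cap

def catch_up(my_index: int) -> int:
    global_index = int(r.get("global_index") or 0)
    missed = min(global_index - my_index, N)
    if missed > 0:
        # LPUSH puts the newest message at index 0, so fetch the `missed`
        # newest entries and replay them oldest-first.
        for raw in reversed(r.lrange("list", 0, missed - 1)):
            handle(raw)
    return global_index          # becomes the new my_index

def handle(message) -> None:
    # Hypothetical handler; replace with real processing.
    print(message)

After catching up, the client keeps listening on the pub/sub channel; whenever a published global_index is more than one ahead of its own my_index, it can simply run the same catch-up again.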

Answered 2012-12-13T23:07:28.810