
I am trying to produce an aggregated view of a continuous market data stream, which means we need to compute a sum over every 2 messages. Say the data comes in as:

(V0,T0),(V1,T1),(V2,T2),(V3,T3)....

V is the value and T is the timestamp at which we received the data.

We need to produce the sum for every 2 points, for example:

(R1=Sum(V0,V1),T1),(R2=Sum(V1,V2),T2),(R3=Sum(V2,V3),T3),....
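
For instance, if the stream is (1,T0),(2,T1),(3,T2),(4,T3), the expected output would be (3,T1),(5,T2),(7,T3): a sliding sum over every two consecutive values.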

Any suggestions on how we can do this using aggregator2, or do we need to write a Processor for it?


2 Answers


You are right, the aggregator2 component is the way to go. I would try something like this:

from("somewhere").split(body().tokenize("),")).streaming()
    .aggregate(new ValueAggregationStrategy()).completionTimeout(1500)
    .to("whatYouWant");

import org.apache.camel.Exchange;
import org.apache.camel.processor.aggregate.AggregationStrategy;

class ValueAggregationStrategy implements AggregationStrategy {

    public Exchange aggregate(Exchange oldExchange, Exchange newExchange) {
        // first message of the group: nothing to merge with yet
        if (oldExchange == null) {
            return newExchange;
        }

        String oldBody = oldExchange.getIn().getBody(String.class);
        String newBody = newExchange.getIn().getBody(String.class);

        // merge by replacing the old body with the sum of both values
        oldExchange.getIn().setBody(extractValue(oldBody) + extractValue(newBody));
        return oldExchange;
    }

    public int extractValue(String body) {
        // Do the work "(V0,T0" -> "V0"; an already-aggregated body is just a plain number
        int start = body.indexOf('(') + 1;
        int end = body.indexOf(',');
        String value = end >= 0 ? body.substring(start, end) : body.substring(start);
        return Integer.parseInt(value.trim());
    }
}
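
Note that the aggregate EIP also needs completion criteria that match the "sum of every 2 messages" requirement; completionTimeout alone just flushes whatever has arrived when the timer fires. A sketch of the same route (endpoint names are placeholders from above) that closes each group as soon as it holds two values:

from("somewhere").split(body().tokenize("),")).streaming()
    // one shared group, completed as soon as it holds 2 values
    .aggregate(constant(true), new ValueAggregationStrategy())
        .completionSize(2).completionTimeout(1500)
    .to("whatYouWant");

This emits one sum per non-overlapping pair; for the overlapping pairs shown in the question a custom sliding-window processor is still needed.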

NB: It would be easier to parse if you could have a format like this: V0,T0;V1,T1...
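
With that layout each token would simply be "Vn,Tn" (split on ";" instead of "),"), and the value extraction shrinks to a sketch like:

public int extractValue(String body) {
    // "V0,T0" -> "V0"
    return Integer.parseInt(body.split(",")[0].trim());
}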

For more information, see the article written by Claus Ibsen on parsing large files with Camel.

answered 2013-11-05T20:54:46.970

Having looked at the Aggregator source code, it turns out Camel only ever aggregates a message into a single group, so for this we had to build a "grouping" processor of our own. Here is the code:

public abstract class GroupingGenerator<I> implements Processor {
private final EvictingQueue<I> queue;
private final int size;

public int getSize() {
    return size;
}

public GroupingGenerator(int size) {
    super();
    this.size = size;
    this.queue = EvictingQueue.create(size);
}

@SuppressWarnings("unchecked")
@Override
public void process(Exchange exchange) throws Exception {
    queue.offer((I) exchange.getIn().getBody());
    if (queue.size() != size) {
        exchange.setProperty(Exchange.ROUTE_STOP, true);
        return;
    } else {
        processGroup(queue, exchange);
    }
}

protected abstract void processGroup(Collection<I> items, Exchange exchange);

}
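
For the sums in the question, a minimal usage sketch inside a RouteBuilder could look like the following (the endpoint names are made up, and it assumes each incoming body has already been converted to an Integer value):

from("direct:values")                       // hypothetical input endpoint
    .process(new GroupingGenerator<Integer>(2) {
        @Override
        protected void processGroup(Collection<Integer> items, Exchange exchange) {
            int sum = 0;
            for (int value : items) {
                sum += value;
            }
            // the Exchange carrying the newest value continues with the sliding sum
            exchange.getIn().setBody(sum);
        }
    })
    .to("mock:result");                     // hypothetical output endpoint

Because the EvictingQueue keeps the last 2 values, every message after the first produces the overlapping sums (V0+V1), (V1+V2), (V2+V3), ... as asked.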
answered 2013-11-14T14:40:01.783