
I've been playing around with some image processing techniques to produce HDR pictures and similar. I find it very hard to align pictures taken in bursts. I tried some naïve motion search algorithms, simply based on comparing small samples of pixels (like 16x16) between different pictures, that pretty much work like this:

- select one 16x16 block in the first picture, one with high contrast, then blur it to reduce noise
- compare it against blocks within a neighbouring radius in the other picture (also blurred for noise), usually using the averaged squared difference
- select the most similar one
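For concreteness, here is a minimal Python/NumPy sketch of that exhaustive block search (the function and parameter names are my own, and the blur sigma and search radius are placeholders rather than tuned values):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def match_block(ref, alt, top, left, block=16, radius=64, sigma=1.5):
    """Exhaustive block matching: find the (dy, dx) offset in `alt` that best
    matches a `block` x `block` patch taken from `ref` at (top, left),
    scored with the averaged squared difference."""
    # Blur both frames first to suppress sensor noise.
    ref_b = gaussian_filter(ref.astype(np.float32), sigma)
    alt_b = gaussian_filter(alt.astype(np.float32), sigma)

    patch = ref_b[top:top + block, left:left + block]
    best_score, best_off = np.inf, (0, 0)
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            y, x = top + dy, left + dx
            if y < 0 or x < 0 or y + block > alt.shape[0] or x + block > alt.shape[1]:
                continue
            cand = alt_b[y:y + block, x:x + block]
            score = np.mean((patch - cand) ** 2)  # averaged squared difference
            if score < best_score:
                best_score, best_off = score, (dy, dx)
    return best_off
```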

I tried a few things to improve this, for example using these search algorithms (https://en.wikipedia.org/wiki/Block-matching_algorithm) to speed it up. The results, however, are not good, and when they are, they are not robust. They also remain computationally very intensive (which precludes usage on a mobile device, for example).
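As an illustration of the kind of search-space reduction those block-matching algorithms use, a classic three-step search can be sketched like this (again my own illustrative naming, assuming grayscale NumPy arrays):

```python
import numpy as np

def three_step_search(ref_block, alt, top, left, step=8):
    """Three-step search: probe the current centre and its 8 neighbours at a
    given step, recentre on the best candidate, halve the step and repeat.
    Far fewer comparisons than the exhaustive search, but it can get stuck
    in a local minimum, which matches the robustness problems described above."""
    h, w = ref_block.shape
    ref_f = ref_block.astype(np.float32)
    cy, cx = top, left
    while step >= 1:
        best_score, best_pos = np.inf, (cy, cx)
        for dy in (-step, 0, step):
            for dx in (-step, 0, step):
                y, x = cy + dy, cx + dx
                if y < 0 or x < 0 or y + h > alt.shape[0] or x + w > alt.shape[1]:
                    continue
                cand = alt[y:y + h, x:x + w].astype(np.float32)
                score = np.mean((ref_f - cand) ** 2)
                if score < best_score:
                    best_score, best_pos = score, (y, x)
        cy, cx = best_pos
        step //= 2
    return cy - top, cx - left
```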

I looked into popular research-based algorithms like https://en.wikipedia.org/wiki/Lucas%E2%80%93Kanade_method, but it does not seem very suitable for big movements. With burst images taken by today's phones, which have sensors > 12 Mpix, even small hand movements easily result in a displacement of 50-100 pixels. The Lucas-Kanade method seems more suitable for small amounts of motion...

It's a bit frustrating, as there seem to be hundreds of apps that do HDR, and they seem to be able to match pictures easily and reliably in a snap... I've tried looking into OpenCV, but all it seems to offer is the Lucas-Kanade method mentioned above. I've also seen projects like https://github.com/almalence/OpenCamera, which do this in pure Java, although the code is not easy to follow (one class of about 5k lines does it all). Does anyone have any pointers to reliable resources?
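For reference, the Lucas-Kanade tracker that OpenCV exposes is the pyramidal variant, and it can be driven roughly like this to estimate a global alignment between two burst frames (the file names are placeholders and the parameters are illustrative, not tuned):

```python
import cv2

# Two frames of a burst, as grayscale images (paths are placeholders).
prev = cv2.imread("frame0.jpg", cv2.IMREAD_GRAYSCALE)
curr = cv2.imread("frame1.jpg", cv2.IMREAD_GRAYSCALE)

# Track a set of high-contrast corners from the first frame into the second.
pts = cv2.goodFeaturesToTrack(prev, maxCorners=200, qualityLevel=0.01, minDistance=30)
new_pts, status, _ = cv2.calcOpticalFlowPyrLK(
    prev, curr, pts, None, winSize=(21, 21), maxLevel=5
)

good_old = pts[status.ravel() == 1]
good_new = new_pts[status.ravel() == 1]

# Fit a single similarity transform (rotation + translation + scale) to the
# matched points and warp the second frame back onto the first.
matrix, _ = cv2.estimateAffinePartial2D(good_new, good_old)
h, w = prev.shape
aligned = cv2.warpAffine(curr, matrix, (w, h))
```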


1 Answer


Take a look at Google's HDR+ paper. It uses a hierarchical algorithm for alignment that is very fast but not that robust. Afterwards it uses a merging algorithm that is robust to alignment failures.

However, using it for regular HDR may be a bit tricky, since the paper says:

"We capture frames of constant exposure, which makes alignment more robust."

Another piece of work that needs sub-pixel-accurate alignment uses an improved version of the alignment introduced in the HDR+ paper.

HDR+ code
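To make the "hierarchical" idea more concrete, here is a much-simplified coarse-to-fine sketch of my own (not the HDR+ code, which aligns per tile rather than globally): the shift is found with a tiny exhaustive search at the coarsest Gaussian-pyramid level, then doubled and refined at each finer level, so even displacements of 50-100 pixels only ever require a small local search.

```python
import cv2
import numpy as np

def coarse_to_fine_shift(ref, alt, levels=5, radius=4):
    """Estimate the global (dy, dx) translation that, applied to `alt`,
    best aligns it with `ref`: a small exhaustive search at the coarsest
    pyramid level, with the offset doubled and refined at every finer level."""
    pyr_ref = [ref.astype(np.float32)]
    pyr_alt = [alt.astype(np.float32)]
    for _ in range(levels - 1):
        pyr_ref.append(cv2.pyrDown(pyr_ref[-1]))
        pyr_alt.append(cv2.pyrDown(pyr_alt[-1]))

    dy, dx = 0, 0
    for r, a in zip(reversed(pyr_ref), reversed(pyr_alt)):  # coarsest first
        dy, dx = dy * 2, dx * 2               # carry the offset up one level
        h, w = r.shape
        best_score, best_off = np.inf, (dy, dx)
        for ddy in range(-radius, radius + 1):
            for ddx in range(-radius, radius + 1):
                # Shift the alternate frame by the candidate offset and score it
                # against the reference (border handling ignored for brevity).
                M = np.float32([[1, 0, dx + ddx], [0, 1, dy + ddy]])
                shifted = cv2.warpAffine(a, M, (w, h))
                score = np.mean((r - shifted) ** 2)
                if score < best_score:
                    best_score, best_off = score, (dy + ddy, dx + ddx)
        dy, dx = best_off
    return dy, dx
```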

Answered 2020-05-12T18:38:44.780