I'm trying to tell whether a given photo is blurry. I know this is basically impossible to do reliably, and that any metric will misclassify some photos.
Still, I'm wondering if there's a simple metric I can use that at least tries to estimate blur. Specifically, the task has a high tolerance for false positives: for example, if I found something that eliminated 90% of blurry photos and only 50% of non-blurry ones, I would be very happy.
I'm trying to implement this in Java. I have the image as an array of pixels (ints). Please keep in mind that I have a limited understanding of image processing techniques (Fourier transforms, etc.), so I would love a very specific walkthrough of how to code a solution.
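To make my starting point concrete, here's a rough sketch of the kind of thing I was picturing. It assumes my ints are packed ARGB and uses the variance of a simple Laplacian filter as a "sharpness" score (I picked that mostly because it looked simple, so I have no idea if it's actually a sensible metric):

```java
// Rough sketch: variance-of-Laplacian blur score on a packed ARGB int[] image.
// Idea: sharp images have strong edges, so the Laplacian response varies a lot;
// a very low variance would suggest the photo is blurry.
// The ARGB packing and the width/height parameters are assumptions about my data.
public final class BlurScore {

    public static double laplacianVariance(int[] pixels, int width, int height) {
        // Convert packed ARGB ints to grayscale luminance.
        double[] gray = new double[width * height];
        for (int i = 0; i < pixels.length; i++) {
            int p = pixels[i];
            int r = (p >> 16) & 0xFF;
            int g = (p >> 8) & 0xFF;
            int b = p & 0xFF;
            gray[i] = 0.299 * r + 0.587 * g + 0.114 * b;
        }

        // Apply a simple 3x3 Laplacian (4-neighbour) to every interior pixel.
        double sum = 0.0, sumSq = 0.0;
        int count = 0;
        for (int y = 1; y < height - 1; y++) {
            for (int x = 1; x < width - 1; x++) {
                int i = y * width + x;
                double lap = 4 * gray[i]
                        - gray[i - 1] - gray[i + 1]
                        - gray[i - width] - gray[i + width];
                sum += lap;
                sumSq += lap * lap;
                count++;
            }
        }

        // Variance of the Laplacian response. I'd then compare this against
        // some threshold tuned by hand on a sample of my own photos.
        double mean = sum / count;
        return sumSq / count - mean * mean;
    }
}
```

If something like this is even on the right track, I'd appreciate guidance on what filter and threshold to use; if it's the wrong approach entirely, I'm happy to start over.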