I wrote an Android app that sends a live video stream from the camera over a socket to my computer. Is it possible to use FFmpeg to decode the MPEG-4 video stream and somehow display what the camera is seeing in real time? I'm guessing I would have to create a bitmap from the latest frame decoded from the byte stream and display it on the computer at 20+ FPS.
How would I go about doing something like this? C++, C# or Java is fine. From my understanding, FFmpeg itself is written in C, so the other languages would need a wrapper or bindings.
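For context, here is the kind of thing I was imagining: a minimal Java sketch that accepts the phone's TCP connection and pipes the raw bytes into `ffplay` (FFmpeg's bundled player) via stdin, letting it do the decoding and display. The class and method names (`StreamViewer`, `ffplayCommand`) are made up for illustration, and it assumes the app sends a raw MPEG-4 elementary stream, which FFmpeg's `m4v` demuxer can read; if the app wraps the stream in a container, the `-f m4v` flag would need to change.

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.ServerSocket;
import java.net.Socket;

public class StreamViewer {
    // Command line for ffplay: read a raw MPEG-4 elementary stream ("m4v"
    // demuxer) from stdin ("-") with input buffering disabled for low latency.
    static String[] ffplayCommand() {
        return new String[] {
            "ffplay",
            "-f", "m4v",           // assumed: raw MPEG-4 video, no container
            "-fflags", "nobuffer", // reduce startup/display latency
            "-i", "-"              // read the stream from stdin
        };
    }

    public static void main(String[] args) throws IOException {
        int port = args.length > 0 ? Integer.parseInt(args[0]) : 5000; // hypothetical port
        try (ServerSocket server = new ServerSocket(port);
             Socket phone = server.accept()) {
            // Launch ffplay and forward every byte from the phone to its stdin.
            Process ffplay = new ProcessBuilder(ffplayCommand())
                    .redirectErrorStream(true)
                    .start();
            try (InputStream in = phone.getInputStream();
                 OutputStream out = ffplay.getOutputStream()) {
                byte[] buf = new byte[8192];
                int n;
                while ((n = in.read(buf)) != -1) {
                    out.write(buf, 0, n);
                    out.flush();
                }
            }
        }
    }
}
```

This sidesteps writing a decoder entirely; a proper in-process solution would link against libavcodec (or a Java/C# binding for it) and render decoded frames to a window yourself.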