I have a webcam that looks like this:
$ ffmpeg -f v4l2 -list_formats all -i /dev/video0
[video4linux2,v4l2 @ 0x55fe6d48a240] Compressed: mjpeg : Motion-JPEG : 1920x1080 1280x720 640x480 320x240
[video4linux2,v4l2 @ 0x55fe6d48a240] Raw : yuyv422 : YUYV 4:2:2 : 640x480 320x240
$ v4l2-ctl --device /dev/video0 --list-formats-ext
ioctl: VIDIOC_ENUM_FMT
    Type: Video Capture

    [0]: 'MJPG' (Motion-JPEG, compressed)
        Size: Discrete 1920x1080
            Interval: Discrete 0.033s (30.000 fps)
        Size: Discrete 1280x720
            Interval: Discrete 0.033s (30.000 fps)
        Size: Discrete 640x480
            Interval: Discrete 0.033s (30.000 fps)
        Size: Discrete 320x240
            Interval: Discrete 0.033s (30.000 fps)
    [1]: 'YUYV' (YUYV 4:2:2)
        Size: Discrete 640x480
            Interval: Discrete 0.033s (30.000 fps)
        Size: Discrete 320x240
            Interval: Discrete 0.033s (30.000 fps)
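So the camera advertises MJPG up to 1920x1080, while raw YUYV only goes up to 640x480. As a sanity check (not shown above), the MJPEG mode can be previewed directly with the same v4l2 demuxer options, which should confirm the camera really delivers 1280x720 on its own, without the loopback device involved:

$ ffplay -f v4l2 -input_format mjpeg -video_size 1280x720 /dev/video0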
I created a virtual video device like this:
modprobe v4l2loopback video_nr=20
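The loopback node does get created; as a quick check that the device exists and responds (v4l2-ctl --info only reports the driver and card name, nothing format-specific):

$ v4l2-ctl --device /dev/video20 --info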
I tried to forward the stream with ffmpeg:
$ ffmpeg -pix_fmt yuv420p -f v4l2 -input_format mjpeg -s 1280x720 -i /dev/video0 -f v4l2 /dev/video20
ffmpeg version 4.2.4-1ubuntu0.1 Copyright (c) 2000-2020 the FFmpeg developers
built with gcc 9 (Ubuntu 9.3.0-10ubuntu2)
configuration: --prefix=/usr --extra-version=1ubuntu0.1 --toolchain=hardened --libdir=/usr/lib/x86_64-linux-gnu --incdir=/usr/include/x86_64-linux-gnu --arch=amd64 --enable-gpl --disable-stripping --enable-avresample --disable-filter=resample --enable-avisynth --enable-gnutls --enable-ladspa --enable-libaom --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libcodec2 --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libjack --enable-libmp3lame --enable-libmysofa --enable-libopenjpeg --enable-libopenmpt --enable-libopus --enable-libpulse --enable-librsvg --enable-librubberband --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxml2 --enable-libxvid --enable-libzmq --enable-libzvbi --enable-lv2 --enable-omx --enable-openal --enable-opencl --enable-opengl --enable-sdl2 --enable-libdc1394 --enable-libdrm --enable-libiec61883 --enable-nvenc --enable-chromaprint --enable-frei0r --enable-libx264 --enable-shared
libavutil 56. 31.100 / 56. 31.100
libavcodec 58. 54.100 / 58. 54.100
libavformat 58. 29.100 / 58. 29.100
libavdevice 58. 8.100 / 58. 8.100
libavfilter 7. 57.100 / 7. 57.100
libavresample 4. 0. 0 / 4. 0. 0
libswscale 5. 5.100 / 5. 5.100
libswresample 3. 5.100 / 3. 5.100
libpostproc 55. 5.100 / 55. 5.100
[video4linux2,v4l2 @ 0x560fef7a8340] The V4L2 driver changed the video from 1280x720 to 640x480
Input #0, video4linux2,v4l2, from '/dev/video0':
Duration: N/A, start: 169134.649958, bitrate: 147456 kb/s
Stream #0:0: Video: rawvideo (YUY2 / 0x32595559), yuyv422, 640x480, 147456 kb/s, 30 fps, 30 tbr, 1000k tbn, 1000k tbc
Stream mapping:
Stream #0:0 -> #0:0 (rawvideo (native) -> rawvideo (native))
Press [q] to stop, [?] for help
Output #0, video4linux2,v4l2, to '/dev/video20':
Metadata:
encoder : Lavf58.29.100
Stream #0:0: Video: rawvideo (YUY2 / 0x32595559), yuyv422, 640x480, q=2-31, 147456 kb/s, 30 fps, 30 tbn, 30 tbc
Metadata:
encoder : Lavc58.54.100 rawvideo
frame= 109 fps= 32 q=-0.0 Lsize=N/A time=00:00:03.63 bitrate=N/A dup=4 drop=0 speed=1.06x
However, I always end up with a virtual device that looks like this:
$ v4l2-ctl --device /dev/video20 --list-formats-ext
ioctl: VIDIOC_ENUM_FMT
    Type: Video Capture

    [0]: 'YUYV' (YUYV 4:2:2)
        Size: Discrete 640x480
            Interval: Discrete 0.033s (30.000 fps)
No matter what I specify in the ffmpeg command, it only ever seems to forward the YUYV format. How can I forward my webcam over MJPG so that the virtual device can be used at a higher resolution?
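For reference, the next variant I was going to try keeps the MJPEG 1280x720 capture options in front of -i but moves the pixel-format request to the output side, so that it applies to what gets written into the loopback device rather than to the capture itself (untested sketch; -video_size is just the v4l2 demuxer's spelling of -s):

$ ffmpeg -f v4l2 -input_format mjpeg -video_size 1280x720 -i /dev/video0 -pix_fmt yuv420p -f v4l2 /dev/video20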