
I have a raw video file (testvideo_1000f.raw) that I am trying to stream in grayscale using ffmpeg, writing the grayscale video to output.swf. The command I am using to do this is:

ffmpeg/ffmpeg -qmin 2 -qmax 31 -s 320x240 -f rawvideo -flags gray -pix_fmt:output gray -an -i testvideo_1000f.raw output.swf

However, the result of this command is a video stream that is in grayscale but still contains some of the chrominance data. The console output from the command is pasted below:

    3 [volta]/home/student/elliott> ffmpeg/ffmpeg -qmin 2 -qmax 31 -s 320x240 -f rawvideo -flags gray -pix_fmt:output gray -an -i testvideo_1000f.raw output.swf
ffmpeg version N-41632-g2b1fc56 Copyright (c) 2000-2012 the FFmpeg developers
  built on Jul 29 2012 10:27:26 with gcc 4.1.2 20080704 (Red Hat 4.1.2-51)
  configuration: 
  libavutil      51. 58.100 / 51. 58.100
  libavcodec     54. 25.100 / 54. 25.100
  libavformat    54.  6.101 / 54.  6.101
  libavdevice    54.  0.100 / 54.  0.100
  libavfilter     2. 80.100 /  2. 80.100
  libswscale      2.  1.100 /  2.  1.100
  libswresample   0. 15.100 /  0. 15.100
*** CHOOSING 8
[rawvideo @ 0xdda9660] Estimating duration from bitrate, this may be inaccurate
Input #0, rawvideo, from 'testvideo_1000f.raw':
  Duration: N/A, start: 0.000000, bitrate: N/A
   Stream #0:0: Video: rawvideo (Y800 / 0x30303859), gray, 320x240, 25 tbr, 25 tbn, 25 tbc
File 'output.swf' already exists. Overwrite ? [y/N] y
w:320 h:240 pixfmt:gray tb:1/25 fr:25/1 sar:0/1 sws_param:flags=2
[ffmpeg_buffersink @ 0xddb7b40] No opaque field provided
[format @ 0xddb7d40] auto-inserting filter 'auto-inserted scaler 0' between the filter 'Parsed_null_0' and the filter 'format'
[auto-inserted scaler 0 @ 0xddb7920] w:320 h:240 fmt:gray sar:0/1 -> w:320 h:240 fmt:yuv420p sar:0/1 flags:0x4
*** CHOOSING 8
Output #0, swf, to 'output.swf':
  Metadata:
    encoder         : Lavf54.6.101
   Stream #0:0: Video: flv1, yuv420p, 320x240, q=2-31, 200 kb/s, 90k tbn, 25 tbc
Stream mapping:
  Stream #0:0 -> #0:0 (rawvideo -> flv)
Press [q] to stop, [?] for help
Truncating packet of size 76800 to 1 2875kB time=00:00:40.84 bitrate= 576.7kbits/s    
frame= 1500 fps=1035 q=24.8 Lsize=    4194kB time=00:01:00.00 bitrate= 572.6kbits/s    
video:4166kB audio:0kB global headers:0kB muxing overhead 0.669245%

I am fairly new to FFmpeg and I am afraid I am using either the wrong syntax or the wrong parameters on my command line. For some reason, the pixel format of the output is yuv420p. I have searched all over for an answer but have had no luck. Could anyone tell me why the output is being formatted as yuv420p when I am asking for 8-bit grayscale? Any help would be greatly appreciated. Thank you.

Marc Elliott


2 Answers

ffmpeg -i VTS_05_1.VOB -pix_fmt gray -vcodec rawvideo -f yuv4mpegpipe - | ffmpeg -y -f yuv4mpegpipe -i - -vcodec libtheora out.avi
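
As I read it, the first ffmpeg here decodes the source, forces the gray pixel format, and writes raw frames to stdout as a yuv4mpeg stream; the second ffmpeg reads that pipe and re-encodes it (with libtheora in this case). Below is a minimal sketch of the same two-stage idea adapted to the raw 320x240 Y800 file from the question (my adaptation, not part of the original answer; note that Theora itself still stores the picture as 4:2:0, so the stream is only luma-only up to the pipe):

# Stage 1: read the headerless gray frames and pass them on, still gray, over a yuv4mpeg pipe.
# Stage 2: read the pipe and re-encode (libtheora/AVI, as in the answer above).
ffmpeg -f rawvideo -pix_fmt gray -s 320x240 -i testvideo_1000f.raw \
       -vcodec rawvideo -pix_fmt gray -f yuv4mpegpipe - \
| ffmpeg -y -f yuv4mpegpipe -i - -vcodec libtheora out.avi
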
answered 2015-01-21T12:20:29.533

There is no ffmpeg flag that will let you do this.

Video formats are designed around YUV, not just Y, so you will not be able to do this without changing your approach. You would have to use mjpeg to get a Y-only stream. Mjpeg supports 8-bit output, but I don't think mjpeg can be put inside an SWF. It can go into MP4 or TS if that is good enough for your purposes.
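
As a hedged aside (not part of the original answer): in reasonably recent ffmpeg builds you can list the pixel formats a given encoder accepts, and any encoder that lists gray can keep the stream 8-bit luma-only end to end. The ffv1/Matroska choice below is my own substitution for illustration and assumes the raw input layout from the question:

# Show the pixel formats an encoder accepts (look for "gray" in the list):
ffmpeg -h encoder=mjpeg
ffmpeg -h encoder=ffv1

# Sketch: encode the raw Y800 input with an encoder that accepts gray, so no
# chroma planes are ever created (lossless ffv1 in Matroska here):
ffmpeg -f rawvideo -pix_fmt gray -s 320x240 -i testvideo_1000f.raw \
       -vcodec ffv1 -pix_fmt gray -an output_gray.mkv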

The other option, of course, is for the decode/display side to decode/display only the luma and not the chroma. Again, that is a custom requirement and not directly supported.
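
On the decode/display side, ffmpeg builds newer than the one in the question do have a generic way to keep only the luma plane, the extractplanes filter. A rough sketch (filter availability depends on your build; the output file name is just an example):

# Play back only the luma plane of the encoded file:
ffplay -vf extractplanes=y output.swf

# Or dump the luma plane back out as headerless 8-bit gray frames:
ffmpeg -i output.swf -vf extractplanes=y -f rawvideo -pix_fmt gray luma_only.raw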

answered 2012-09-09T03:52:52.617