
I'm trying to use ffmpeg to capture frames from a video file, but I can't even get the video's duration. Every time I try to access pFormatCtx->duration I get 0. I know the pointer is initialized and contains the correct duration, because if I use av_dump_format(pFormatCtx, 0, videoName, 0); then I actually get the duration along with other information about the video. This is what I get when I use av_dump_format(pFormatCtx, 0, videoName, 0);

Input #0, avi, from 'futurama.avi':
Duration: 00:21:36.28, start: 0.000000, bitrate: 1135 kb/s
Stream #0.0: Video: mpeg4 (Advanced Simple Profile), yuv420p, 512x384
[PAR 1:1 DAR 4:3], 25 tbr, 25 tbn, 25 tbc
Stream #0.1: Audio: ac3, 48000 Hz, stereo, s16, 192 kb/s 

I don't understand why av_dump_format can display the duration while I can't. I checked the function definition, and to display the duration the function also uses pFormatCtx->duration. And it's not just the duration: when I access them in main.cpp, other member variables don't show the correct data either.

Here is my code:

extern "C" {
    #include<libavcodec/avcodec.h>
    #include<libavformat/avformat.h>
    #include<libswscale/swscale.h>
}

#include <iostream>
using std::cout;
using std::endl;

int main(int argc, char *argv[]) {
    AVFormatContext *pFormatCtx = NULL;

    const char videoName[] = "futurama.avi";

    // Register all formats and codecs.
    av_register_all();
    cout << "Opening the video file";
    // Open video file
    int ret = avformat_open_input(&pFormatCtx, videoName, NULL, NULL);
    if (ret != 0) {
        cout << "Couldn't open the video file." << ret ;
        return -1;
    }
    if(avformat_find_stream_info(pFormatCtx, 0) < 0) {
        cout << "problem with stream info";
        return -1;
    }

    av_dump_format(pFormatCtx, 0, videoName, 0);
    cout << pFormatCtx->bit_rate << endl; // different value each time, not initialized properly.
    cout << pFormatCtx->duration << endl; // 0
    return 0;
}

I don't know if it helps, but I'm using QtCreator on Ubuntu and linking the libraries statically.

4 Answers

The duration field is in time_base units, not milliseconds or seconds. Converting it to milliseconds is simple enough:

double time_base =  (double)video_stream->time_base.num / (double)video_stream->time_base.den;
double duration = (double)video_stream->duration * time_base * 1000.0;

The duration is now in milliseconds; just use floor or ceil to get an integral number of milliseconds, whichever you prefer.

answered 2014-01-16T17:01:11.060

The difference between av_open_input_file() and avformat_open_input() may be that the latter does not read the stream information, so duration is left uninitialized. Calling avformat_find_stream_info() solved the problem for me.

I took the calculation/display code snippet from http://ffmpeg.org/doxygen/trunk/dump_8c_source.html#l00480 (note that the line numbers can and probably will change in newer versions), added some initialization code, and "it works for me". Hope it helps.

#include <libavutil/avutil.h>
#include <libavformat/avformat.h>

int main()
{
    const char* file = "sample.mpg";
    AVFormatContext* formatContext = NULL;

    av_register_all();

    // Open video file
    avformat_open_input(&formatContext, file, NULL, NULL);
    avformat_find_stream_info(formatContext, NULL);

    // Lower log level since av_log() prints at AV_LOG_ERROR by default
    av_log_set_level(AV_LOG_INFO);

    av_log(NULL, AV_LOG_INFO, "  Duration: ");
    if (formatContext->duration != AV_NOPTS_VALUE) {
        int hours, mins, secs, us;
        int64_t duration = formatContext->duration + 5000;
        secs  = duration / AV_TIME_BASE;
        us    = duration % AV_TIME_BASE;
        mins  = secs / 60;   
        secs %= 60;          
        hours = mins / 60;   
        mins %= 60;
        av_log(NULL, AV_LOG_INFO, "%02d:%02d:%02d.%02d\n", hours, mins, secs, (100 * us) / AV_TIME_BASE);
    } 

    return 0;
}

Compile with:

gcc -o duration duration.c -lavformat -lavutil
answered 2015-07-27T13:04:25.590

How to get duration information (and more) from ffmpeg

I messed around with ffmpeg a while ago and found the learning curve to be pretty steep. So even though the OP asked this question months ago, I'll post some code in case others here on SO are looking to do something similar. The Open() function below is complete but has many asserts and lacks proper error handling.

Right off, one immediate difference I see is that I used av_open_input_file instead of avformat_open_input. I also didn't use av_dump_format.

Calculating the duration can be tricky, especially with H.264 and MPEG-2; see how durationSec is calculated below.

Note: This example also uses the JUCE C++ Utility Library.

Note2: This code is a modified version of the ffmpeg tutorial.

void VideoCanvas::Open(const char* videoFileName)
{       
    Logger::writeToLog(String(L"Opening video file ") + videoFileName);
    Close();

    AVCodec *pCodec;

    // register all formats and codecs
    av_register_all();  

    // open video file
    int ret = av_open_input_file(&pFormatCtx, videoFileName, NULL, 0, NULL);
    if (ret != 0) {
        Logger::writeToLog("Unable to open video file: " + String(videoFileName));
        Close();
        return;
    }

    // Retrieve stream information
    ret = av_find_stream_info(pFormatCtx);
    jassert(ret >= 0);

    // Find the first video stream
    videoStream = -1;
    audioStream = -1;
    for(int i=0; i<pFormatCtx->nb_streams; i++) {
        if (pFormatCtx->streams[i]->codec->codec_type==AVMEDIA_TYPE_VIDEO && videoStream < 0) {
            videoStream = i;            
        }
        if (pFormatCtx->streams[i]->codec->codec_type == AVMEDIA_TYPE_AUDIO && audioStream < 0) {
            audioStream = i;
        }
    } // end for i
    jassert(videoStream != -1);
    jassert(audioStream != -1);

    // Get a pointer to the codec context for the video stream
    pCodecCtx=pFormatCtx->streams[videoStream]->codec;
    jassert(pCodecCtx != nullptr);

    /**
      * This is the fundamental unit of time (in seconds) in terms
      * of which frame timestamps are represented. For fixed-fps content,
      * timebase should be 1/framerate and timestamp increments should be
      * identically 1.
      * - encoding: MUST be set by user.
      * - decoding: Set by libavcodec.
      */
    AVRational avr = pCodecCtx->time_base;
    Logger::writeToLog("time_base = " + String(avr.num) + "/" + String(avr.den));

    /**
     * For some codecs, the time base is closer to the field rate than the frame rate.
     * Most notably, H.264 and MPEG-2 specify time_base as half of frame duration
     * if no telecine is used ...
     *
     * Set to time_base ticks per frame. Default 1, e.g., H.264/MPEG-2 set it to 2.
     */
    ticksPerFrame = pCodecCtx->ticks_per_frame;
    Logger::writeToLog("ticks_per_frame = " + String(pCodecCtx->ticks_per_frame));

    durationSec = static_cast<double>(pFormatCtx->streams[videoStream]->duration) * static_cast<double>(ticksPerFrame) / static_cast<double>(avr.den);
    double fH = durationSec / 3600.;
    int     H = static_cast<int>(fH);
    double fM = (fH - H) * 60.;
    int     M = static_cast<int>(fM);
    double fS = (fM - M) * 60.;
    int     S = static_cast<int>(fS);

    Logger::writeToLog("Video stream duration = " + String(H) + "H " + String(M) + "M " + String(fS, 3) + "S");

    // calculate frame rate based on time_base and ticks_per_frame
    frameRate = static_cast<double>(avr.den) / static_cast<double>(avr.num * pCodecCtx->ticks_per_frame);
    Logger::writeToLog("Frame rate = " + String(frameRate) );

    // audio codec context
    if (audioStream != -1) {
        aCodecCtx = pFormatCtx->streams[audioStream]->codec;

        Logger::writeToLog("Audio sample rate = " + String(aCodecCtx->sample_rate));
        Logger::writeToLog("Audio channels    = " + String(aCodecCtx->channels));       
    }
    jassert(aCodecCtx != nullptr);

    // format:
    // The "S" in "S16SYS" stands for "signed", the 16 says that each sample is 16 bits long, 
    // and "SYS" means that the endian-order will depend on the system you are on. This is the
    // format that avcodec_decode_audio2 will give us the audio in.

    // open the audio codec
    if (audioStream != -1) {
        aCodec = avcodec_find_decoder(aCodecCtx->codec_id);
        if (!aCodec) {
            Logger::writeToLog(L"Unsupported codec ID = " + String(aCodecCtx->codec_id) );
            Close();
            return;  // TODO: should we just play video if audio codec doesn't work?
        }
        avcodec_open(aCodecCtx, aCodec);
    }


    // Find the decoder for the video stream
    pCodec=avcodec_find_decoder(pCodecCtx->codec_id);
    if(pCodec == nullptr) {
        jassert(false);
        // fprintf(stderr, "Unsupported codec!\n");
        //return -1; // Codec not found
    }

    // Open video codec
    ret = avcodec_open(pCodecCtx, pCodec);
    jassert(ret >= 0);

    // Allocate video frame
    pFrame=avcodec_alloc_frame();
    jassert(pFrame != nullptr);

    // Allocate an AVFrame structure
    pFrameRGB=avcodec_alloc_frame();
    jassert(pFrameRGB != nullptr);

    int numBytes = avpicture_get_size(PIX_FMT_RGB32, pCodecCtx->width, pCodecCtx->height);
    jassert(numBytes != 0);
    buffer=(uint8_t *)av_malloc(numBytes*sizeof(uint8_t));
    jassert(buffer != nullptr);

    // note: the pixel format here is RGB, but sws_getContext() needs to be PIX_FMT_BGR24 to match (BGR)
    // this might have to do w/ endian-ness....make sure this is platform independent
    if (m_image != nullptr) delete m_image;
    m_image = new Image(Image::ARGB, pCodecCtx->width, pCodecCtx->height, true);

    int dstW = pCodecCtx->width; // don't rescale
    int dstH = pCodecCtx->height;
    Logger::writeToLog(L"Video width = " + String(dstW));
    Logger::writeToLog(L"Video height = " + String(dstH));

    // this should only have to be done once
    img_convert_ctx = sws_getContext(pCodecCtx->width, pCodecCtx->height, pCodecCtx->pix_fmt, dstW, dstH, PIX_FMT_RGB32, SWS_FAST_BILINEAR, NULL, NULL, NULL);
    jassert(img_convert_ctx != nullptr);  

    setSize(pCodecCtx->width, pCodecCtx->height);

} // Open()
answered 2013-06-04T15:34:15.440

You can get the duration from the AVFormatContext, but the duration in the format context is in AV_TIME_BASE units.
Read more about the FFmpeg time base.

From the avformat.h documentation:

/**
 * Duration of the stream, in AV_TIME_BASE fractional seconds. Only set
 * this value if you know none of the individual stream durations and
 * also do not set any of them. This is deduced from the AVStream values
 * if not set. Demuxing only, set by libavformat.
 */
int64_t duration;

So you should convert the time base to seconds using av_q2d(AV_TIME_BASE_Q):

AVFormatContext *fmt_ctx;

/* init fmt_ctx etc. */

double duration_in_sec = fmt_ctx->duration * av_q2d(AV_TIME_BASE_Q);
answered 2021-08-30T11:11:43.753