I am trying to encode a single YUV420P image obtained from a CMSampleBuffer into an AVPacket so that I can send h264 video over the network with RTMP.
The code posted below appears to run, in that avcodec_encode_video2 returns 0 (success), but got_output is also 0 (the AVPacket is empty).
Does anyone with experience encoding video on iOS devices know what I might be doing wrong?
- (void) captureOutput:(AVCaptureOutput *)captureOutput
 didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
        fromConnection:(AVCaptureConnection *)connection {
    // sampleBuffer now contains an individual frame of raw video data
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);

    // Access the pixel data
    int width = (int)CVPixelBufferGetWidth(pixelBuffer);
    int height = (int)CVPixelBufferGetHeight(pixelBuffer);
    int bytesPerRow = (int)CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 0);
    unsigned char *rawPixelBase = (unsigned char *)CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0);

    // Convert the raw pixel data to h.264
    AVCodec *codec = 0;
    AVCodecContext *context = 0;
    AVFrame *frame = 0;
    AVPacket packet;

    //avcodec_init();
    avcodec_register_all();
    codec = avcodec_find_encoder(AV_CODEC_ID_H264);

    if (codec == 0) {
        NSLog(@"Codec not found!!");
        return;
    }

    context = avcodec_alloc_context3(codec);
    if (!context) {
        NSLog(@"Context no bueno.");
        return;
    }

    // Bit rate
    context->bit_rate = 400000; // HARD CODE
    context->bit_rate_tolerance = 10;

    // Resolution
    context->width = width;
    context->height = height;

    // Frames per second
    context->time_base = (AVRational){1, 25};
    context->gop_size = 1;
    //context->max_b_frames = 1;
    context->pix_fmt = PIX_FMT_YUV420P;

    // Open the codec
    if (avcodec_open2(context, codec, 0) < 0) {
        NSLog(@"Unable to open codec");
        return;
    }

    // Create the frame
    frame = avcodec_alloc_frame();
    if (!frame) {
        NSLog(@"Unable to alloc frame");
        return;
    }

    frame->format = context->pix_fmt;
    frame->width = context->width;
    frame->height = context->height;

    avpicture_fill((AVPicture *)frame, rawPixelBase, context->pix_fmt, frame->width, frame->height);

    int got_output = 0;
    av_init_packet(&packet);
    avcodec_encode_video2(context, &packet, frame, &got_output);

    // Unlock the pixel data
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

    // Send the data over the network
    [self uploadData:[NSData dataWithBytes:packet.data length:packet.size] toRTMP:self.rtmp_OutVideoStream];
}
Note: I am aware that this code leaks memory, since I never free the memory it dynamically allocates.
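For reference, a minimal cleanup sketch is shown below, assuming the same vintage of the FFmpeg API used above (the avcodec_alloc_frame / avcodec_encode_video2 era); the exact free functions vary by libavcodec version:

// Hypothetical teardown for the allocations made above.
if (got_output) {
    av_free_packet(&packet);    // release the encoded data owned by the packet
}
avcodec_close(context);         // close the encoder
av_free(context);               // free the codec context
avcodec_free_frame(&frame);     // free the AVFrame wrapper only; the pixel data
                                // still belongs to the CVPixelBuffer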
UPDATE

I have updated my code to use @pogorskiy's method. I only attempt to upload the frame if got_output returns 1, and I clear the buffer once I am done encoding the video frame.
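Roughly, the encode-and-upload step now looks like the sketch below (assuming av_free_packet from the FFmpeg version used above, and the same uploadData:toRTMP: helper from my original code):

int got_output = 0;
av_init_packet(&packet);
packet.data = NULL;    // let the encoder allocate the output buffer
packet.size = 0;

int ret = avcodec_encode_video2(context, &packet, frame, &got_output);
if (ret == 0 && got_output == 1) {
    // only upload when the encoder actually produced a packet
    [self uploadData:[NSData dataWithBytes:packet.data length:packet.size]
              toRTMP:self.rtmp_OutVideoStream];
    av_free_packet(&packet);    // clear the buffer once the frame has been sent
}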