
I'm developing an app that combines multiple video clips recorded by the user. The clips are recorded on the camera and overlaid with another video, and the recorded clips are then combined into one long clip. The length of each clip is determined by the overlay video file.

I'm using AVAssetExportSession and exportAsynchronouslyWithCompletionHandler:. Strangely, this works for some clips and not others. The real problem is that the exporter doesn't report any error or failure: it just sits at zero progress and never calls the completion handler.

I don't even know where to start looking for the problem. Here is the function I use to stitch the clips together:

- (void) setupAndStitchVideos:(NSMutableArray*)videoData
{
    // Filepath to where the final generated video is stored
    NSURL                       *   exportUrl           = nil;
    // Contains information about a single asset/track
    NSDictionary                *   assetOptions        = nil;
    AVURLAsset                  *   currVideoAsset      = nil;
    AVURLAsset                  *   currAudioAsset      = nil;
    AVAssetTrack                *   currVideoTrack      = nil;
    AVAssetTrack                *   currAudioTrack      = nil;
    // Contains all tracks and time ranges used to build the final composition
    NSMutableArray              *   allVideoTracks      = nil;
    NSMutableArray              *   allVideoRanges      = nil;
    NSMutableArray              *   allAudioTracks      = nil;
    NSMutableArray              *   allAudioRanges      = nil;

    AVMutableCompositionTrack   *   videoTracks         = nil;
    AVMutableCompositionTrack   *   audioTracks         = nil;
    // Misc time values used when calculating a clip's start time and total length
    float                           animationLength     = 0.0f;
    float                           clipLength          = 0.0f;
    float                           startTime           = 0.0f;
    CMTime                          clipStart           = kCMTimeZero;
    CMTime                          clipDuration        = kCMTimeZero;
    CMTimeRange                     currRange           = kCMTimeRangeZero;
    // The final composition to be generated and exported
    AVMutableComposition        *   finalComposition    = nil;

    // Cancel any already active exports
    if (m_activeExport)
    {
        [m_activeExport cancelExport];
        m_activeExport = nil;
    }

    // Initialize and setup all composition related member variables
    allVideoTracks      = [[NSMutableArray alloc] init];
    allAudioTracks      = [[NSMutableArray alloc] init];
    allVideoRanges      = [[NSMutableArray alloc] init];
    allAudioRanges      = [[NSMutableArray alloc] init];
    exportUrl           = [NSURL fileURLWithPath:[MobveoAnimation getMergeDestination]];
    finalComposition    = [AVMutableComposition composition];
    videoTracks         = [finalComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    audioTracks         = [finalComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    assetOptions        = [NSDictionary dictionaryWithObject:[NSNumber numberWithBool:YES] forKey:AVURLAssetPreferPreciseDurationAndTimingKey];
    animationLength     = m_animation.videoDuration;

    // Define all of the audio and video tracks that will be used in the composition
    for (NSDictionary * currData in videoData)
    {
        currVideoAsset  = [AVURLAsset URLAssetWithURL:[currData objectForKey:KEY_STITCH_VIDEO_URL] options:assetOptions];
        currAudioAsset  = [AVURLAsset URLAssetWithURL:[currData objectForKey:KEY_STITCH_AUDIO_URL] options:assetOptions];
        currVideoTrack  = [[currVideoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];

        NSArray *audioTracks = [currAudioAsset tracksWithMediaType:AVMediaTypeAudio];
        if ( audioTracks != nil && audioTracks.count > 0 )
        {
            currAudioTrack  = audioTracks[0];
        }
        else
        {
            currAudioTrack = nil;
        }

        clipLength      = animationLength * [(NSNumber*)[currData objectForKey:KEY_STITCH_LENGTH_PERCENTAGE] floatValue];
        clipStart       = CMTimeMakeWithSeconds(startTime, currVideoAsset.duration.timescale);
        clipDuration    = CMTimeMakeWithSeconds(clipLength, currVideoAsset.duration.timescale);

        NSLog(@"Clip length: %.2f", clipLength);
        NSLog(@"Clip Start: %lld", clipStart.value );
        NSLog(@"Clip duration: %lld", clipDuration.value);

        currRange       = CMTimeRangeMake(clipStart, clipDuration);
        [allVideoTracks addObject:currVideoTrack];

        if ( currAudioTrack != nil )
        {
            [allAudioTracks addObject:currAudioTrack];
            [allAudioRanges addObject:[NSValue valueWithCMTimeRange:currRange]];
        }

        [allVideoRanges addObject:[NSValue valueWithCMTimeRange:currRange]];
        startTime       += clipLength;
    }
    [videoTracks insertTimeRanges:allVideoRanges ofTracks:allVideoTracks atTime:kCMTimeZero error:nil];

    if ( allAudioTracks.count > 0 )
    {
        [audioTracks insertTimeRanges:allAudioRanges ofTracks:allAudioTracks atTime:kCMTimeZero error:nil];
    }
    for ( int i = 0; i < allVideoTracks.count - allAudioTracks.count; ++i )
    {
        CMTimeRange curRange = [allVideoRanges[i] CMTimeRangeValue];
        [audioTracks insertEmptyTimeRange:curRange];
    }

    // Delete any previous exported video files that may already exist
    [[NSFileManager defaultManager] removeItemAtURL:exportUrl error:nil];

    // Begin the composition generation and export process!
    m_activeExport = [[AVAssetExportSession alloc] initWithAsset:finalComposition presetName:AVAssetExportPreset1280x720];
    [m_activeExport setOutputFileType:AVFileTypeQuickTimeMovie];
    [m_activeExport setOutputURL:exportUrl];
    NSLog(@"Exporting async");
    [m_activeExport exportAsynchronouslyWithCompletionHandler:^(void)
     {
         NSLog(@"Export complete");
         // Cancel the update timer
         [m_updateTimer invalidate];
         m_updateTimer = nil;

         // Dismiss the displayed dialog
         [m_displayedDialog hide:TRUE];
         m_displayedDialog = nil;

         // Re-enable touch events
         [[UIApplication sharedApplication] endIgnoringInteractionEvents];

         // Report the success/failure result
         switch (m_activeExport.status)
         {
             case AVAssetExportSessionStatusFailed:
                 [self performSelectorOnMainThread:@selector(videoStitchingFailed:) withObject:m_activeExport.error waitUntilDone:FALSE];
                 break;
             case AVAssetExportSessionStatusCompleted:
                 [self performSelectorOnMainThread:@selector(videoStitchingComplete:) withObject:m_activeExport.outputURL waitUntilDone:FALSE];
                 break;
         }

         // Clear our reference to the completed export
         m_activeExport = nil;
     }];
}
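
One diagnostic worth running before creating the session, as a sketch (this is my addition, not part of the original flow; it assumes the same `finalComposition` variable from the function above is in scope):

```objc
// Diagnostic sketch: ask AVFoundation which export presets it considers
// compatible with the composition before constructing the session.
// If the 1280x720 preset is missing, the export is doomed from the start.
NSArray *presets = [AVAssetExportSession exportPresetsCompatibleWithAsset:finalComposition];
if (![presets containsObject:AVAssetExportPreset1280x720])
{
    NSLog(@"Preset 1280x720 not compatible with this composition. Available: %@", presets);
}
```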

EDIT:

Thanks to Josh in the comments, I noticed I wasn't using the error parameter. In the cases where it now fails, I get the extremely helpful "The operation could not be completed" error when inserting the time ranges of the video tracks:

NSError *videoError = nil;
[videoTracks insertTimeRanges:allVideoRanges ofTracks:allVideoTracks atTime:kCMTimeZero error:&videoError];

if ( videoError != nil )
{
    NSLog(@"Error adding video track: %@", videoError);
}

Output:

Error adding video track: Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo=0x17426dd00 {NSUnderlyingError=0x174040cc0 "The operation couldn’t be completed. (OSStatus error -12780.)", NSLocalizedFailureReason=An unknown error occurred (-12780), NSLocalizedDescription=The operation could not be completed}

It is worth noting, however, that nowhere in the entire codebase is urlWithString used instead of fileUrlWithPath, so that is not the problem.
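
For what it's worth, -11800/-12780 can also surface when a requested time range falls outside the source track. A sanity-check sketch that could go inside the stitching loop above (my assumption about the failure mode, not a confirmed fix; `currRange` and `currVideoTrack` are the loop's own variables):

```objc
// Sanity-check sketch: verify the range being requested from each source
// track actually lies within that track's own time range before it is
// handed to insertTimeRanges:ofTracks:atTime:error:.
if (!CMTimeRangeContainsTimeRange(currVideoTrack.timeRange, currRange))
{
    NSLog(@"Requested range %@ is outside source track range %@",
          (NSString *)CFBridgingRelease(CMTimeRangeCopyDescription(NULL, currRange)),
          (NSString *)CFBridgingRelease(CMTimeRangeCopyDescription(NULL, currVideoTrack.timeRange)));
}
```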


1 Answer


Judging from your for-in enumeration of the videoData array, after you initialize the composition member variables, it looks as though you are blocking the calling thread. Although you are permitted to access each AVAssetTrack instance, the values for its keys are not always immediately available, and fetching them this way runs synchronously.

Instead, try registering for change notifications using the AVAsynchronousKeyValueLoading protocol. Apple's documentation should help you sort out the problem and get you on your way!
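
A sketch of what that could look like for the assets created in the question's loop (assuming the same `currVideoAsset` variable; the "tracks" key is what tracksWithMediaType: depends on):

```objc
// Sketch: load the "tracks" key asynchronously before calling
// tracksWithMediaType:, instead of forcing a synchronous load that
// can block the calling thread.
[currVideoAsset loadValuesAsynchronouslyForKeys:@[@"tracks"] completionHandler:^{
    NSError *loadError = nil;
    AVKeyValueStatus status = [currVideoAsset statusOfValueForKey:@"tracks" error:&loadError];
    if (status == AVKeyValueStatusLoaded)
    {
        AVAssetTrack *track = [[currVideoAsset tracksWithMediaType:AVMediaTypeVideo] firstObject];
        // ...continue building the composition with this track...
    }
    else
    {
        NSLog(@"Failed to load tracks: %@", loadError);
    }
}];
```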

Here are some Apple recommendations on AVFoundation that I rounded up:

[image: excerpt from Apple's AVFoundation documentation]

Hope that solves the problem! Good luck, and let me know if you have any other questions or issues.

Answered 2015-08-11T11:49:04.553