7

I have been struggling with several dimensions of the problem of controlling video orientation during and after capture on an iOS device. Thanks to previous answers and documentation from Apple, I've been able to figure most of it out. However, now that I want to push some video to a website, I'm running into peculiar problems. I outlined this problem in particular in this question, and the proposed solution turned out to require setting orientation options during video encoding.

That may well be, but I have no clue how to go about it. The documentation around setting orientation is about setting it correctly for display on the device, and I have implemented the advice found there. However, that advice does not set the orientation properly for non-Apple software, such as VLC or the Chrome browser.

Can anyone provide insight into how to set the orientation correctly on the device so that the video displays correctly in all viewing software?


6 Answers

10

Finally, based on the answers of @Aaron Vegh and @Prince, I figured out my solution:

// Converting video
+(void)convertMOVToMp4:(NSString *)movFilePath completion:(void (^)(NSString *mp4FilePath))block{


AVURLAsset * videoAsset = [[AVURLAsset alloc]initWithURL:[NSURL fileURLWithPath:movFilePath]  options:nil];

AVAssetTrack *sourceAudioTrack = [[videoAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];

AVMutableComposition* composition = [AVMutableComposition composition];


AVMutableCompositionTrack *compositionAudioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                                            preferredTrackID:kCMPersistentTrackID_Invalid];
[compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                               ofTrack:sourceAudioTrack
                                atTime:kCMTimeZero error:nil];




AVAssetExportSession * assetExport = [[AVAssetExportSession alloc] initWithAsset:composition
                                                                      presetName:AVAssetExportPresetMediumQuality];


NSString *exportPath =  [movFilePath stringByReplacingOccurrencesOfString:@".MOV" withString:@".mp4"];


NSURL * exportUrl = [NSURL fileURLWithPath:exportPath];


assetExport.outputFileType = AVFileTypeMPEG4;
assetExport.outputURL = exportUrl;
assetExport.shouldOptimizeForNetworkUse = YES;
assetExport.videoComposition = [self getVideoComposition:videoAsset composition:composition];

[assetExport exportAsynchronouslyWithCompletionHandler:
 ^(void ) {
     switch (assetExport.status)
     {
         case AVAssetExportSessionStatusCompleted:
             // export complete
             if (block) {
                 block(exportPath);
             }
             break;
         case AVAssetExportSessionStatusFailed:
         case AVAssetExportSessionStatusCancelled:
             // export failed or was cancelled; guard the block before calling
             if (block) {
                 block(nil);
             }
             break;
         default:
             break;
     }
 }];
}

// Get the current orientation

+(AVMutableVideoComposition *)getVideoComposition:(AVAsset *)asset composition:(AVMutableComposition *)composition{
    BOOL isPortrait_ = [self isVideoPortrait:asset];


    AVMutableCompositionTrack *compositionVideoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];


    AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofTrack:videoTrack atTime:kCMTimeZero error:nil];

    AVMutableVideoCompositionLayerInstruction *layerInst = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:compositionVideoTrack];

    CGAffineTransform transform = videoTrack.preferredTransform;
    [layerInst setTransform:transform atTime:kCMTimeZero];


    AVMutableVideoCompositionInstruction *inst = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    inst.timeRange = CMTimeRangeMake(kCMTimeZero, asset.duration);
    inst.layerInstructions = [NSArray arrayWithObject:layerInst];


    AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
    videoComposition.instructions = [NSArray arrayWithObject:inst];

    CGSize videoSize = videoTrack.naturalSize;
    if(isPortrait_) {
        NSLog(@"video is portrait ");
        videoSize = CGSizeMake(videoSize.height, videoSize.width);
    }
    videoComposition.renderSize = videoSize;
    videoComposition.frameDuration = CMTimeMake(1,30);
    videoComposition.renderScale = 1.0;
    return videoComposition;
}

// Determine the video's orientation

+(BOOL)isVideoPortrait:(AVAsset *)asset{
    BOOL isPortrait = NO;
    NSArray *tracks = [asset tracksWithMediaType:AVMediaTypeVideo];
    if ([tracks count] > 0) {
        AVAssetTrack *videoTrack = [tracks objectAtIndex:0];
        CGAffineTransform t = videoTrack.preferredTransform;
        // Portrait
        if (t.a == 0 && t.b == 1.0 && t.c == -1.0 && t.d == 0) {
            isPortrait = YES;
        }
        // PortraitUpsideDown
        if (t.a == 0 && t.b == -1.0 && t.c == 1.0 && t.d == 0) {
            isPortrait = YES;
        }
        // LandscapeRight
        if (t.a == 1.0 && t.b == 0 && t.c == 0 && t.d == 1.0) {
            isPortrait = NO;
        }
        // LandscapeLeft
        if (t.a == -1.0 && t.b == 0 && t.c == 0 && t.d == -1.0) {
            isPortrait = NO;
        }
    }
    return isPortrait;
}
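
For what it's worth, a minimal usage sketch (my addition; the host class name VideoConverter is hypothetical, substitute whatever class holds these methods):

[VideoConverter convertMOVToMp4:movPath completion:^(NSString *mp4FilePath) {
    // mp4FilePath is nil if the export failed or was cancelled
    if (mp4FilePath) {
        NSLog(@"MP4 ready for upload at %@", mp4FilePath);
    }
}];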

Answered 2014-09-14T12:48:37.317
7

In the Apple documentation here, it states:

Clients may now receive physically rotated CVPixelBuffers in their AVCaptureVideoDataOutput -captureOutput:didOutputSampleBuffer:fromConnection: delegate callback. In previous iOS versions, the front-facing camera would always deliver buffers in AVCaptureVideoOrientationLandscapeLeft and the back-facing camera would always deliver buffers in AVCaptureVideoOrientationLandscapeRight. All 4 AVCaptureVideoOrientations are supported, and rotation is hardware accelerated. To request buffer rotation, a client calls -setVideoOrientation: on the AVCaptureVideoDataOutput's video AVCaptureConnection. Note that physically rotating buffers does come with a performance cost, so only request rotation if it's necessary. If, for instance, you want rotated video written to a QuickTime movie file using AVAssetWriter, it is preferable to set the -transform property on the AVAssetWriterInput rather than physically rotate the buffers in AVCaptureVideoDataOutput.

So the solution posted by Aaron Vegh that uses an AVAssetExportSession works, but is not needed. As the Apple docs say, if you want the orientation set properly so that the video plays in non-Apple QuickTime players such as VLC, or on the web with Chrome, you must set the video orientation on the AVCaptureConnection for the AVCaptureVideoDataOutput. If you try to set it on the AVAssetWriterInput instead, you will get the wrong orientation in players like VLC and Chrome.

Here is my code, where I set it while setting up the capture session:

// DECLARED AS PROPERTIES ABOVE
@property (strong,nonatomic) AVCaptureDeviceInput *audioIn;
@property (strong,nonatomic) AVCaptureAudioDataOutput *audioOut;
@property (strong,nonatomic) AVCaptureDeviceInput *videoIn;
@property (strong,nonatomic) AVCaptureVideoDataOutput *videoOut;
@property (strong,nonatomic) AVCaptureConnection *audioConnection;
@property (strong,nonatomic) AVCaptureConnection *videoConnection;
------------------------------------------------------------------
------------------------------------------------------------------

-(void)setupCaptureSession{
// Setup Session
self.session = [[AVCaptureSession alloc]init];
[self.session setSessionPreset:AVCaptureSessionPreset640x480];

// Create Audio connection ----------------------------------------
self.audioIn = [[AVCaptureDeviceInput alloc]initWithDevice:[self getAudioDevice] error:nil];
if ([self.session canAddInput:self.audioIn]) {
    [self.session addInput:self.audioIn];
}

self.audioOut = [[AVCaptureAudioDataOutput alloc]init];
dispatch_queue_t audioCaptureQueue = dispatch_queue_create("Audio Capture Queue", DISPATCH_QUEUE_SERIAL);
[self.audioOut setSampleBufferDelegate:self queue:audioCaptureQueue];
if ([self.session canAddOutput:self.audioOut]) {
    [self.session addOutput:self.audioOut];
}
self.audioConnection = [self.audioOut connectionWithMediaType:AVMediaTypeAudio];

// Create Video connection ----------------------------------------
self.videoIn = [[AVCaptureDeviceInput alloc]initWithDevice:[self videoDeviceWithPosition:AVCaptureDevicePositionBack] error:nil];
if ([self.session canAddInput:self.videoIn]) {
    [self.session addInput:self.videoIn];
}

self.videoOut = [[AVCaptureVideoDataOutput alloc]init];
[self.videoOut setAlwaysDiscardsLateVideoFrames:NO];
[self.videoOut setVideoSettings:nil];
dispatch_queue_t videoCaptureQueue =  dispatch_queue_create("Video Capture Queue", DISPATCH_QUEUE_SERIAL);
[self.videoOut setSampleBufferDelegate:self queue:videoCaptureQueue];
if ([self.session canAddOutput:self.videoOut]) {
    [self.session addOutput:self.videoOut];
}

self.videoConnection = [self.videoOut connectionWithMediaType:AVMediaTypeVideo];
// SET THE ORIENTATION HERE -------------------------------------------------
[self.videoConnection setVideoOrientation:AVCaptureVideoOrientationPortrait];
// --------------------------------------------------------------------------

// Create Preview Layer -------------------------------------------
AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc]initWithSession:self.session];
CGRect bounds = self.videoView.bounds;
previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
previewLayer.bounds = bounds;
previewLayer.position=CGPointMake(CGRectGetMidX(bounds), CGRectGetMidY(bounds));
[self.videoView.layer addSublayer:previewLayer];

// Start session
[self.session startRunning];

}
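
With the orientation set on the connection, the sample buffers arrive already physically rotated. As a minimal sketch of the delegate side (the videoWriterInput property is hypothetical, standing in for however you consume the frames):

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Buffers from the video connection are already rotated, so no
    // -transform needs to be set on the writer input.
    if (connection == self.videoConnection &&
        self.videoWriterInput.isReadyForMoreMediaData) {
        [self.videoWriterInput appendSampleBuffer:sampleBuffer];
    }
}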

Answered 2013-06-19T15:59:28.493
3

In case anyone else is looking for this answer too, this is the method I cooked up (modified a bit to simplify it):

- (void)encodeVideoOrientation:(NSURL *)anOutputFileURL
{
CGAffineTransform rotationTransform;
CGAffineTransform rotateTranslate;
CGSize renderSize;

switch (self.recordingOrientation)
{
    // Illustrative values only (my assumption): recordingOrientation is
    // an AVCaptureVideoOrientation and the source is a 640x480 landscape
    // buffer; adjust the sizes and transforms to match your capture preset.
    case AVCaptureVideoOrientationPortrait:
        rotationTransform = CGAffineTransformMakeRotation(M_PI_2);
        rotateTranslate   = CGAffineTransformTranslate(rotationTransform, 0.0, -480.0);
        renderSize        = CGSizeMake(480.0, 640.0);
        break;
    case AVCaptureVideoOrientationPortraitUpsideDown:
        rotationTransform = CGAffineTransformMakeRotation(-M_PI_2);
        rotateTranslate   = CGAffineTransformTranslate(rotationTransform, -640.0, 0.0);
        renderSize        = CGSizeMake(480.0, 640.0);
        break;
    case AVCaptureVideoOrientationLandscapeLeft:
        rotationTransform = CGAffineTransformMakeRotation(M_PI);
        rotateTranslate   = CGAffineTransformTranslate(rotationTransform, -640.0, -480.0);
        renderSize        = CGSizeMake(640.0, 480.0);
        break;
    case AVCaptureVideoOrientationLandscapeRight:
    default:
        rotationTransform = CGAffineTransformIdentity;
        rotateTranslate   = rotationTransform;
        renderSize        = CGSizeMake(640.0, 480.0);
        break;
}


AVURLAsset * videoAsset = [[AVURLAsset alloc]initWithURL:anOutputFileURL options:nil];

AVAssetTrack *sourceVideoTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
AVAssetTrack *sourceAudioTrack = [[videoAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];

AVMutableComposition* composition = [AVMutableComposition composition];

AVMutableCompositionTrack *compositionVideoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
[compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                               ofTrack:sourceVideoTrack
                                atTime:kCMTimeZero error:nil];
[compositionVideoTrack setPreferredTransform:sourceVideoTrack.preferredTransform];

AVMutableCompositionTrack *compositionAudioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                                            preferredTrackID:kCMPersistentTrackID_Invalid];
[compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                               ofTrack:sourceAudioTrack
                                atTime:kCMTimeZero error:nil];



AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
AVMutableVideoCompositionLayerInstruction *layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:compositionVideoTrack];
[layerInstruction setTransform:rotateTranslate atTime:kCMTimeZero];

AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
videoComposition.frameDuration = CMTimeMake(1,30);
videoComposition.renderScale = 1.0;
videoComposition.renderSize = renderSize;
instruction.layerInstructions = [NSArray arrayWithObject: layerInstruction];
instruction.timeRange = CMTimeRangeMake(kCMTimeZero, videoAsset.duration);
videoComposition.instructions = [NSArray arrayWithObject: instruction];

AVAssetExportSession * assetExport = [[AVAssetExportSession alloc] initWithAsset:composition
                                                                      presetName:AVAssetExportPresetMediumQuality];

NSString* videoName = @"export.mp4"; // use an extension that matches AVFileTypeMPEG4
NSString *exportPath = [NSTemporaryDirectory() stringByAppendingPathComponent:videoName];

NSURL * exportUrl = [NSURL fileURLWithPath:exportPath];

if ([[NSFileManager defaultManager] fileExistsAtPath:exportPath])
{
    [[NSFileManager defaultManager] removeItemAtPath:exportPath error:nil];
}

assetExport.outputFileType = AVFileTypeMPEG4;
assetExport.outputURL = exportUrl;
assetExport.shouldOptimizeForNetworkUse = YES;
assetExport.videoComposition = videoComposition;

[assetExport exportAsynchronouslyWithCompletionHandler:
 ^(void ) {
     switch (assetExport.status)
     {
         case AVAssetExportSessionStatusCompleted:
             //                export complete
             NSLog(@"Export Complete");
             break;
         case AVAssetExportSessionStatusFailed:
             NSLog(@"Export Failed");
             NSLog(@"ExportSessionError: %@", [assetExport.error localizedDescription]);
             //                export error (see exportSession.error)
             break;
         case AVAssetExportSessionStatusCancelled:
             NSLog(@"Export Failed");
             NSLog(@"ExportSessionError: %@", [assetExport.error localizedDescription]);
             //                export cancelled
             break;
     }
 }];

}

Unfortunately, this stuff is poorly documented, but by stringing together examples from other SO questions and reading the header files, I was able to get it working. Hope this helps someone else!

Answered 2012-11-26T03:20:57.930
2

Use the method below to set the correct orientation on an AVMutableVideoComposition, according to the video asset's orientation:

-(AVMutableVideoComposition *) getVideoComposition:(AVAsset *)asset
{
  AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
  AVMutableComposition *composition = [AVMutableComposition composition];
  AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
  CGSize videoSize = videoTrack.naturalSize;
  BOOL isPortrait_ = [self isVideoPortrait:asset];
  if(isPortrait_) {
      NSLog(@"video is portrait ");
      videoSize = CGSizeMake(videoSize.height, videoSize.width);
  }
  composition.naturalSize     = videoSize;
  videoComposition.renderSize = videoSize;
  videoComposition.frameDuration = CMTimeMakeWithSeconds( 1 / videoTrack.nominalFrameRate, 600);

  AVMutableCompositionTrack *compositionVideoTrack;
  compositionVideoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
  [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofTrack:videoTrack atTime:kCMTimeZero error:nil];
  AVMutableVideoCompositionLayerInstruction *layerInst;
  layerInst = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
  [layerInst setTransform:videoTrack.preferredTransform atTime:kCMTimeZero];
  AVMutableVideoCompositionInstruction *inst = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
  inst.timeRange = CMTimeRangeMake(kCMTimeZero, asset.duration);
  inst.layerInstructions = [NSArray arrayWithObject:layerInst];
  videoComposition.instructions = [NSArray arrayWithObject:inst];
  return videoComposition;
}


-(BOOL) isVideoPortrait:(AVAsset *)asset
{
  BOOL isPortrait = NO;
  NSArray *tracks = [asset tracksWithMediaType:AVMediaTypeVideo];
  if ([tracks count] > 0) {
    AVAssetTrack *videoTrack = [tracks objectAtIndex:0];
    CGAffineTransform t = videoTrack.preferredTransform;
    // Portrait
    if (t.a == 0 && t.b == 1.0 && t.c == -1.0 && t.d == 0) {
        isPortrait = YES;
    }
    // PortraitUpsideDown
    if (t.a == 0 && t.b == -1.0 && t.c == 1.0 && t.d == 0) {
        isPortrait = YES;
    }
    // LandscapeRight
    if (t.a == 1.0 && t.b == 0 && t.c == 0 && t.d == 1.0) {
        isPortrait = NO;
    }
    // LandscapeLeft
    if (t.a == -1.0 && t.b == 0 && t.c == 0 && t.d == -1.0) {
        isPortrait = NO;
    }
  }
  return isPortrait;
}
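
For completeness, a minimal sketch of wiring the returned composition into an export (my addition; asset and outputURL are assumed to exist, and self implements the methods above):

AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:asset
                                                                  presetName:AVAssetExportPresetMediumQuality];
exporter.outputFileType = AVFileTypeMPEG4;
exporter.outputURL = outputURL;
// Bake the preferredTransform into the rendered frames during export
exporter.videoComposition = [self getVideoComposition:asset];
[exporter exportAsynchronouslyWithCompletionHandler:^{
    NSLog(@"Export finished with status: %ld", (long)exporter.status);
}];
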
Answered 2013-05-03T05:06:03.760
1

Since iOS 5 you can request rotated CVPixelBuffers using AVCaptureVideoDataOutput, documented here. This gets you the correct orientation without having to re-process the video with an AVAssetExportSession.
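
A minimal sketch of that request, assuming a running AVCaptureSession with an AVCaptureVideoDataOutput named videoOut already added to it:

AVCaptureConnection *connection = [videoOut connectionWithMediaType:AVMediaTypeVideo];
if ([connection isVideoOrientationSupported]) {
    // Rotation is hardware accelerated but still has a cost,
    // so only request it when you actually need rotated buffers.
    [connection setVideoOrientation:AVCaptureVideoOrientationPortrait];
}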

Answered 2013-05-04T05:31:48.263
0

Here is an up-to-date Swift version of @Jagie's code.

extension AVURLAsset
{
    func exportVideo(presetName: String = AVAssetExportPresetHighestQuality, outputFileType: AVFileType = .mp4, fileExtension: String = "mp4", then completion: @escaping (URL?) -> Void)
    {
        let filename = url.deletingPathExtension().appendingPathExtension(fileExtension).lastPathComponent
        let outputURL = FileManager.default.temporaryDirectory.appendingPathComponent(filename)
        
        do { // delete old video, if already exists
                try FileManager.default.removeItem(at: outputURL)
        } catch {
            print(error.localizedDescription)
        }
        
        guard let sourceAudioTrack = self.tracks(withMediaType: .audio).first else { completion(nil); return }
        let composition = AVMutableComposition()
        let compositionAudioTrack = composition.addMutableTrack(withMediaType: .audio, preferredTrackID: kCMPersistentTrackID_Invalid)
        do{
            try compositionAudioTrack?.insertTimeRange(CMTimeRangeMake(start: CMTime.zero, duration: self.duration), of: sourceAudioTrack, at: CMTime.zero)
        }catch(let e){
            print("error: \(e)")
        }
        
        
        if let session = AVAssetExportSession(asset: composition, presetName: presetName) {
            session.outputURL = outputURL
            session.outputFileType = outputFileType
            // A zero timescale would produce an invalid CMTime, so use CMTime.zero
            let range = CMTimeRange(start: .zero, duration: duration)
            session.timeRange = range
            session.shouldOptimizeForNetworkUse = true
            
            session.videoComposition =  getVideoComposition(asset: self, composition: composition)
            
            session.exportAsynchronously {
                switch session.status {
                case .completed:
                    completion(outputURL)
                case .cancelled:
                    debugPrint("Video export cancelled.")
                    completion(nil)
                case .failed:
                    let errorMessage = session.error?.localizedDescription ?? "n/a"
                    debugPrint("Video export failed with error: \(errorMessage)")
                    completion(nil)
                default:
                    break
                }
            }
        } else {
            completion(nil)
        }
    }
    
    
    private func getVideoComposition(asset: AVAsset, composition: AVMutableComposition) -> AVMutableVideoComposition{
        let isPortrait = isVideoPortrait()

        let compositionVideoTrack: AVMutableCompositionTrack = composition.addMutableTrack(withMediaType: .video, preferredTrackID:kCMPersistentTrackID_Invalid)!
        
        let videoTrack:AVAssetTrack = asset.tracks(withMediaType: .video).first!
        do{
        try compositionVideoTrack.insertTimeRange(CMTimeRangeMake(start: CMTime.zero, duration: asset.duration), of: videoTrack, at: CMTime.zero)
        }catch(let e){
            print("Error: \(e)")
        }
        
        let layerInst:AVMutableVideoCompositionLayerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: compositionVideoTrack)
        
        let transform = videoTrack.preferredTransform
        layerInst.setTransform(transform, at: CMTime.zero)

        let inst:AVMutableVideoCompositionInstruction = AVMutableVideoCompositionInstruction()
        inst.timeRange = CMTimeRangeMake(start: CMTime.zero, duration: asset.duration);
        inst.layerInstructions = [layerInst]
        
        let videoComposition: AVMutableVideoComposition = AVMutableVideoComposition()
        videoComposition.instructions = [inst]
        
        var videoSize: CGSize = videoTrack.naturalSize
        if isPortrait {
            print("video is portrait")
            videoSize = CGSize(width: videoSize.height, height: videoSize.width)
        } else {
            print("video is landscape")
        }
        
        videoComposition.renderSize = videoSize
        videoComposition.frameDuration = CMTimeMake(value: 1, timescale: 30)
        videoComposition.renderScale = 1.0
        return videoComposition
    }
    
    func isVideoPortrait() -> Bool {
        var isPortrait = false
        let tracks = self.tracks(withMediaType: .video)
        if let videoTrack = tracks.first {
            let t: CGAffineTransform = videoTrack.preferredTransform
            // Portrait
            if t.a == 0 && t.b == 1.0 && t.c == -1.0 && t.d == 0 {
                isPortrait = true
            }
            // PortraitUpsideDown
            if t.a == 0 && t.b == -1.0 && t.c == 1.0 && t.d == 0 {
                isPortrait = true
            }
            // LandscapeRight
            if t.a == 1.0 && t.b == 0 && t.c == 0 && t.d == 1.0 {
                isPortrait = false
            }
            // LandscapeLeft
            if t.a == -1.0 && t.b == 0 && t.c == 0 && t.d == -1.0 {
                isPortrait = false
            }
        }
        return isPortrait
    }
}
Answered 2022-02-11T13:39:40.760