
Using the new Photos framework I can request the NSData for an asset with requestImageDataForAsset. I can also get the file URL from the PHImageFileURLKey entry of the returned info NSDictionary.

[[PHImageManager defaultManager] requestImageDataForAsset:asset options:nil resultHandler:^(NSData *imageData, NSString *dataUTI, UIImageOrientation orientation, NSDictionary *info) {
    // imageData contains the correct data for images and videos
    NSLog(@"info - %@", info);
    NSURL *fileURL = [info objectForKey:@"PHImageFileURLKey"];
}];

This works fine for images and regular videos.

However, when the asset is a PHAssetMediaSubtypeVideoHighFrameRate (slow-motion video), the data returned corresponds to a JPG file containing the first frame of the video (the NSData, dataUTI and info dictionary all point to the same JPG file). As an example, these are the URL and dataUTI returned for a slow-motion video:

PHImageFileURLKey = "file:///var/mobile/Media/PhotoData/Metadata/DCIM/100APPLE/IMG_0642.JPG"; PHImageFileUTIKey = "public.jpeg";

Why is this happening, and how can I access the NSData/NSURL of the slow-motion video instead of this JPG preview?


4 Answers


After going crazy and testing every option, I found the problem.

The culprit behind the JPG image being returned for slow-motion videos is the default value of the PHImageRequestOptions.version property, PHImageRequestOptionsVersionCurrent.

Simply setting the version to PHImageRequestOptionsVersionUnadjusted or PHImageRequestOptionsVersionOriginal returns the original slow-motion video.

PHImageRequestOptions * imageRequestOptions = [[PHImageRequestOptions alloc] init];

imageRequestOptions.version = PHImageRequestOptionsVersionUnadjusted;
// or 
imageRequestOptions.version = PHImageRequestOptionsVersionOriginal;
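
For completeness, a minimal sketch of feeding these options back into the original request (reusing the asset variable from the question; the logging in the handler is only an assumption for illustration):

[[PHImageManager defaultManager] requestImageDataForAsset:asset
                                                   options:imageRequestOptions
                                             resultHandler:^(NSData *imageData, NSString *dataUTI, UIImageOrientation orientation, NSDictionary *info) {
    // With the unadjusted/original version, a slow-motion asset now resolves to
    // the movie file rather than the JPG preview.
    NSLog(@"dataUTI - %@", dataUTI);
    NSLog(@"fileURL - %@", [info objectForKey:@"PHImageFileURLKey"]);
}];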

I consider this unexpected behavior, since I would not expect the "current" version of a slow-motion video to be a still image (a video with the slow-motion effect applied, perhaps, but not a photo).

Hope this is useful to someone.

Answered 2014-10-02T11:19:27.297

It is important to note that a slow-motion video has the type AVComposition, not AVURLAsset. An AVComposition object combines media data from multiple sources.

Exporting the slow-motion video

To achieve this, I basically went through three steps:

  1. Create an output URL for the video
  2. Configure the export session
  3. Export the video and grab the URL!

PHVideoRequestOptions *options = [PHVideoRequestOptions new];
options.networkAccessAllowed = YES;
[[PHImageManager defaultManager] requestAVAssetForVideo:asset options:options resultHandler:^(AVAsset *asset, AVAudioMix *audioMix, NSDictionary *info) {
    if ([asset isKindOfClass:[AVComposition class]] && ((AVComposition *)asset).tracks.count == 2) {
        // Slow-motion video. See here: https://overflow.buffer.com/2016/02/29/slow-motion-video-ios/

        // Output URL of the slow-motion file.
        NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
        NSString *documentsDirectory = paths.firstObject;
        NSString *myPathDocs = [documentsDirectory stringByAppendingPathComponent:[NSString stringWithFormat:@"mergeSlowMoVideo-%d.mov", arc4random() % 1000]];
        NSURL *url = [NSURL fileURLWithPath:myPathDocs];

        // Begin slow-motion video export
        AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetHighestQuality];
        exporter.outputURL = url;
        exporter.outputFileType = AVFileTypeQuickTimeMovie;
        exporter.shouldOptimizeForNetworkUse = YES;

        [exporter exportAsynchronouslyWithCompletionHandler:^{
            dispatch_async(dispatch_get_main_queue(), ^{
                if (exporter.status == AVAssetExportSessionStatusCompleted) {
                    NSURL *URL = exporter.outputURL;
                    self.filePath = URL.absoluteString;

                    // NSData *videoData = [NSData dataWithContentsOfURL:URL];
                    //
                    // // Upload
                    // [self uploadSelectedVideo:video data:videoData];
                }
            });
        }];
    }
}];

See this great blog post on slow-motion videos in iOS: https://overflow.buffer.com/2016/02/29/slow-motion-video-ios/

Answered 2016-09-08T09:37:50.557

Here is a code snippet for Swift 3/4:

PHImageManager.default().requestAVAsset(forVideo: asset,
                                        options: nil,
                                        resultHandler: { (asset, _, _) in

    // AVAsset has two relevant subclasses: AVComposition and AVURLAsset
    // AVComposition for slow-motion videos
    // AVURLAsset for normal videos

    // For slow-motion videos, check for AVComposition and create an exporter
    // to write the video into a local file path, then use that file to play/upload

    if asset!.isKind(of: AVComposition.self) {

        let avCompositionAsset = asset as! AVComposition

        if avCompositionAsset.tracks.count > 1 {

            let exporter = AVAssetExportSession(asset: avCompositionAsset, presetName: AVAssetExportPresetHighestQuality)
            exporter!.outputURL = self.fetchOutputURL()
            exporter!.outputFileType = AVFileTypeMPEG4
            exporter!.shouldOptimizeForNetworkUse = true

            exporter!.exportAsynchronously {
                DispatchQueue.main.sync {
                    // Use this URL for uploading or playing the video
                    let url = exporter!.outputURL
                }
            }
        }
    } else {

        // Normal videos are backed by AVURLAsset

        let url = (asset as! AVURLAsset).url
    }
})

// Fetch local path

 func fetchOutputURL() -> URL{
     let documentDirectory = getDocumentsDirectory() as NSString
     let path = documentDirectory.appendingPathComponent("test.mp4")
     return URL(fileURLWithPath:path)
 }
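
The getDocumentsDirectory() helper is not shown in the answer; here is a minimal sketch, assuming it simply returns the path of the app's Documents directory:

// Hypothetical helper (not part of the original answer): returns the
// app's Documents directory path.
func getDocumentsDirectory() -> String {
    return NSSearchPathForDirectoriesInDomains(.documentDirectory, .userDomainMask, true)[0]
}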
Answered 2017-10-04T07:38:20.320

// Slow-motion video

PHVideoRequestOptions *options=[[PHVideoRequestOptions alloc]init];
options.version=PHVideoRequestOptionsVersionOriginal;

Request the AVAsset from PHImageManager:

[[PHImageManager defaultManager] requestAVAssetForVideo:videoAsset options:options resultHandler:^(AVAsset *asset, AVAudioMix *audioMix, NSDictionary *info)
 {
     if ([asset isKindOfClass:[AVURLAsset class]])
     {
         // use URL to get file content
         NSURL *URL = [(AVURLAsset *)asset URL];
         NSData *videoData = [NSData dataWithContentsOfURL:URL];
         NSNumber *fileSizeValue = nil;
         [URL getResourceValue:&fileSizeValue forKey:NSURLFileSizeKey error:nil];
     }
 }];
Answered 2015-11-03T04:51:28.800