
I need to get the GPS coordinates of an image taken with the iOS device's camera. I don't care about Camera Roll images, only images taken with UIImagePickerControllerSourceTypeCamera.

I've read many stackoverflow answers, such as Get Exif data from UIImage - UIImagePickerController, which either assume you are using the AssetsLibrary framework, which doesn't seem to work for camera images, or use CoreLocation to get the latitude/longitude from the app itself, not from the image.

Using CoreLocation is not an option. It won't give me the coordinates at the moment the shutter button was pressed. (With a CoreLocation-based solution, you have to record the coordinates either before or after bringing up the camera view, and of course if the device is moving the coordinates will be wrong. This approach should work with a stationary device.)

I'm iOS 5 only, so I don't need to support older devices. This is also for a commercial product, so I can't use http://code.google.com/p/iphone-exif/

So, what are my options for reading GPS data from the image returned by the camera in iOS 5? All I can think of right now is saving the image to the Camera Roll and then using AssetsLibrary, but that seems like an ugly workaround.

Thanks!


Here's the code I wrote based on Caleb's answer.

    UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];

    NSData *jpeg = UIImageJPEGRepresentation(image, 1.0);
    CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)jpeg, NULL);

    NSDictionary *metadataNew = (__bridge NSDictionary *)CGImageSourceCopyPropertiesAtIndex(source, 0, NULL);

    NSLog(@"%@",metadataNew);

and my console shows:

    2012-04-26 14:15:37:137 ferret[2060:1799] {
        ColorModel = RGB;
        Depth = 8;
        Orientation = 6;
        PixelHeight = 1936;
        PixelWidth = 2592;
        "{Exif}" =     {
            ColorSpace = 1;
            PixelXDimension = 2592;
            PixelYDimension = 1936;
        };
        "{JFIF}" =     {
            DensityUnit = 0;
            JFIFVersion =         (
                1,
                1
            );
            XDensity = 1;
            YDensity = 1;
        };
        "{TIFF}" =     {
            Orientation = 6;
        };
    }

No latitude/longitude.


9 Answers


The problem is that, since iOS 4, `UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];` strips the geolocation data. To fix this, you have to use the original photo's reference URL to access the full image metadata. Like this:

- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    NSURL *referenceURL = [info objectForKey:UIImagePickerControllerReferenceURL];
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    [library assetForURL:referenceURL resultBlock:^(ALAsset *asset) {
        ALAssetRepresentation *rep = [asset defaultRepresentation];
        NSDictionary *metadata = rep.metadata;
        NSLog(@"%@", metadata);

        CGImageRef iref = [rep fullScreenImage] ;

        if (iref) {
            self.imageView.image = [UIImage imageWithCGImage:iref];
        }
    } failureBlock:^(NSError *error) {
        // error handling
    }];
}

The output should look something like this:

{
    ColorModel = RGB;
    DPIHeight = 72;
    DPIWidth = 72;
    Depth = 8;
    Orientation = 6;
    PixelHeight = 1936;
    PixelWidth = 2592;
    "{Exif}" =     {
        ApertureValue = "2.970854";
        BrightnessValue = "1.115874";
        ColorSpace = 1;
        ComponentsConfiguration =         (
            0,
            0,
            0,
            1
        );
        DateTimeDigitized = "2012:07:14 21:55:05";
        DateTimeOriginal = "2012:07:14 21:55:05";
        ExifVersion =         (
            2,
            2,
            1
        );
        ExposureMode = 0;
        ExposureProgram = 2;
        ExposureTime = "0.06666667";
        FNumber = "2.8";
        Flash = 24;
        FlashPixVersion =         (
            1,
            0
        );
        FocalLength = "3.85";
        ISOSpeedRatings =         (
            200
        );
        MeteringMode = 5;
        PixelXDimension = 2592;
        PixelYDimension = 1936;
        SceneCaptureType = 0;
        SensingMethod = 2;
        Sharpness = 2;
        ShutterSpeedValue = "3.9112";
        SubjectArea =         (
            1295,
            967,
            699,
            696
        );
        WhiteBalance = 0;
    };
    "{GPS}" =     {
        Altitude = "1167.528";
        AltitudeRef = 0;
        ImgDirection = "278.8303";
        ImgDirectionRef = T;
        Latitude = "15.8235";
        LatitudeRef = S;
        Longitude = "47.99416666666666";
        LongitudeRef = W;
        TimeStamp = "00:55:04.59";
    };
    "{TIFF}" =     {
        DateTime = "2012:07:14 21:55:05";
        Make = Apple;
        Model = "iPhone 4";
        Orientation = 6;
        ResolutionUnit = 2;
        Software = "5.1.1";
        XResolution = 72;
        YResolution = 72;
        "_YCbCrPositioning" = 1;
    };
}
Answered 2012-09-02T19:57:56.337

We have done a lot of work with the camera and UIImagePickerController, and at least up to and including iOS 5.1.1, it does not return location data in the metadata for photos taken with UIImagePickerController.

It doesn't matter whether Location Services is enabled for the Camera app or not; that controls the Camera app's use of Location Services, not the camera function inside UIImagePickerController.

Your app will need to use the CLLocation class to get the location, and then add it to the image or video returned from the camera. Whether your app can get the location depends on whether the user authorizes your app to access Location Services. Note that the user can disable Location Services for your app (or for the device entirely) at any time via Settings > Location Services.
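One fiddly part of writing a CLLocation back into an image yourself is that EXIF-style GPS fields store an unsigned magnitude plus a hemisphere reference (as in the `{GPS}` dictionary shown above: Latitude = 15.8235 with LatitudeRef = S). A minimal standalone C sketch of that split, with hypothetical names; on iOS you would store the results under ImageIO's GPS dictionary keys:

```c
#include <math.h>

/* Split a signed decimal-degrees coordinate into the unsigned value
 * and hemisphere reference letter that EXIF expects. `pos`/`neg` are
 * the reference letters for the positive and negative hemispheres,
 * e.g. "N"/"S" for latitude and "E"/"W" for longitude. */
static double exif_gps_value(double degrees, const char *pos, const char *neg,
                             const char **ref_out) {
    *ref_out = (degrees < 0.0) ? neg : pos;
    return fabs(degrees);
}

/* Example:
 *   const char *ref;
 *   exif_gps_value(-15.8235, "N", "S", &ref);  -> 15.8235, ref = "S"
 *   exif_gps_value(-47.9941, "E", "W", &ref);  -> 47.9941, ref = "W"
 */
```

The same split applies in reverse when reading: a LatitudeRef of S or a LongitudeRef of W means the value should be negated to get a signed coordinate.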

Answered 2012-07-06T17:29:02.230

You're not using the image data from the camera in your posted code; you're generating a JPEG representation of it, which effectively discards all the metadata. Use image.CGImage as Caleb suggested.

Also:

"This is also for a commercial product, so I can't use http://code.google.com/p/iphone-exif/"

The author states quite explicitly that commercial licensing is available.

Answered 2012-04-26T18:28:40.500

One possibility is to leave CoreLocation running while the camera is visible. Record each CLLocation into an array, along with the time of the sample. When the photo comes back, find its time, then match the closest CLLocation from the array.

Sounds kludgy, but it would work.
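The matching step itself is trivial; here is a minimal standalone C sketch of it (the struct and names are hypothetical stand-ins for an array of logged CLLocations):

```c
#include <math.h>
#include <stddef.h>

/* A recorded location sample: a timestamp in seconds plus coordinates.
 * Stands in for a CLLocation logged while the camera was on screen. */
typedef struct {
    double time;      /* seconds since some common epoch */
    double latitude;
    double longitude;
} LocationSample;

/* Return the index of the sample whose timestamp is closest to the
 * photo's timestamp, or -1 if no samples were recorded. */
static ptrdiff_t closest_sample(const LocationSample *samples, size_t count,
                                double photo_time) {
    if (count == 0)
        return -1;
    ptrdiff_t best = 0;
    for (size_t i = 1; i < count; i++) {
        if (fabs(samples[i].time - photo_time) <
            fabs(samples[best].time - photo_time))
            best = (ptrdiff_t)i;
    }
    return best;
}
```

For a few seconds of samples a linear scan is fine; a long, time-sorted log could use binary search instead.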

Answered 2012-04-27T13:22:02.497

Can't say I've needed to do exactly this in my own stuff, but from the docs it seems clear that if you're using UIImagePickerController, you can get the image the user just took in the -imagePickerController:didFinishPickingMediaWithInfo: delegate method. Use the key UIImagePickerControllerOriginalImage to get the image.

Once you have the image, you should be able to access its properties, including EXIF data, as described in QA1654, Accessing image properties with ImageIO. To create the CGImageSource, I'd look at CGImageSourceCreateWithData() and use the data you get from UIImage's CGImage method. Once you have the image source, you can access its various properties via CGImageSourceCopyProperties().

Answered 2012-04-26T17:00:31.793

As Chris Markle pointed out, Apple does strip GPS data from the EXIF. But you can open the raw data of the image and parse it yourself, or use a third-party library to do this.

Here is some sample code:

- (void) imagePickerController: (UIImagePickerController *) picker
 didFinishPickingMediaWithInfo: (NSDictionary *) info {

    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    [library assetForURL:[info objectForKey:UIImagePickerControllerReferenceURL]
             resultBlock:^(ALAsset *asset) {

                 ALAssetRepresentation *image_representation = [asset defaultRepresentation];
                 NSUInteger size = (NSUInteger)image_representation.size;
                 // create a buffer to hold image data
                 uint8_t *buffer = (Byte*)malloc(size);
                 NSUInteger length = [image_representation getBytes:buffer fromOffset:0 length:size error:nil];

                 if (length != 0)  {

                     // buffer -> NSData object; free buffer afterwards
                     NSData *adata = [[NSData alloc] initWithBytesNoCopy:buffer length:size freeWhenDone:YES];

                     EXFJpeg* jpegScanner = [[EXFJpeg alloc] init];
                     [jpegScanner scanImageData: adata];
                     EXFMetaData* exifData = jpegScanner.exifMetaData;

                     id latitudeValue = [exifData tagValue:[NSNumber numberWithInt:EXIF_GPSLatitude]];
                     id longitudeValue = [exifData tagValue:[NSNumber numberWithInt:EXIF_GPSLongitude]];
                     id datetime = [exifData tagValue:[NSNumber numberWithInt:EXIF_DateTime]];
                     id t = [exifData tagValue:[NSNumber numberWithInt:EXIF_Model]];

                     self.locationLabel.text = [NSString stringWithFormat:@"Local: %@ - %@",latitudeValue,longitudeValue];
                     self.dateLavel.text = [NSString stringWithFormat:@"Data: %@", datetime];

                 }
                 else {
                     NSLog(@"image_representation buffer length == 0");
                 }
             }
            failureBlock:^(NSError *error) {
                NSLog(@"couldn't get asset: %@", error);
            }
     ];
}
Answered 2014-01-16T18:26:47.337

This was tested on iOS 8 and works for videos, so it should work for photos with a few tweaks.

- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {

    NSURL *videoUrl = (NSURL *)[info objectForKey:UIImagePickerControllerMediaURL];
    NSString *moviePath = [videoUrl path];

    if ( UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(moviePath) ) {

        ALAssetsLibrary *assetLibrary = [[ALAssetsLibrary alloc] init];

        [assetLibrary assetForURL:[info objectForKey:UIImagePickerControllerReferenceURL] resultBlock:^(ALAsset *asset) {

            CLLocation *location = [asset valueForProperty:ALAssetPropertyLocation];
            NSLog(@"Location Meta: %@", location);

        } failureBlock:^(NSError *error) {
            NSLog(@"Video Date Error: %@", error);
        }];

    }

}
Answered 2014-09-23T07:25:24.800

A Swift answer:

import AssetsLibrary
import CoreLocation


// MARK: - UIImagePickerControllerDelegate
extension ViewController: UIImagePickerControllerDelegate {
    func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [String : Any]) {
        defer {
            dismiss(animated: true, completion: nil)
        }
        guard picker.sourceType == .photoLibrary else {
            return
        }
        guard let url = info[UIImagePickerControllerReferenceURL] as? URL else {
            return
        }

        let library = ALAssetsLibrary()
        library.asset(for: url, resultBlock: { (asset) in
            guard let coordinate = asset?.value(forProperty: ALAssetPropertyLocation) as? CLLocation else {
                return
            }
            print("\(coordinate)")

            // Getting human-readable address.
            let geocoder = CLGeocoder()
            geocoder.reverseGeocodeLocation(coordinate, completionHandler: { (placemarks, error) in
                guard let placemark = placemarks?.first else {
                    return
                }
                print("\(placemark.addressDictionary)")
            })
        }, failureBlock: { (error: Error?) in
            print("Unable to read metadata: \(error)")
        })
    }
}
Answered 2016-11-01T08:06:03.330

In your UIImagePickerController delegate, do the following:

- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
  NSDictionary *metadata = [info valueForKey:UIImagePickerControllerMediaMetadata];

  // metadata now contains all the image metadata.  Extract GPS data from here.
}
Answered 2012-05-02T17:28:04.033