
Reading the GPS data from the image returned by the camera in iOS (iPhone)

I need to get the GPS coordinates of an image taken with the iOS device's camera. I do not care about the Camera Roll images, just the image taken with UIImagePickerControllerSourceTypeCamera.

I've read many Stack Overflow answers, like Get Exif data from UIImage - UIImagePickerController, which either assume you are using the AssetsLibrary framework, which doesn't seem to work on camera images, or use CoreLocation to get the latitude/longitude from the app itself, not from the image.

Using CoreLocation is not an option. It will not give me the coordinates at the moment the shutter button was pressed. (With the CoreLocation-based solutions, you record the coordinates either before you bring up the camera view or after it is dismissed, and of course if the device is moving those coordinates will be wrong. That approach only really works with a stationary device.)

My app is iOS 5 only, so I don't need to support older devices. This is also a commercial product, so I cannot use http://code.google.com/p/iphone-exif/.

So, what are my options for reading the GPS data from the image returned by the camera in iOS 5? All I can think of right now is to save the image to the Camera Roll and then read it back with the AssetsLibrary, but that seems hokey.
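For what it's worth, the "save then read back" fallback I mentioned would look roughly like this (a sketch, not tested; it assumes the `info` dictionary from the picker delegate, and note that `UIImagePickerControllerMediaMetadata` only round-trips whatever metadata the picker itself supplies):

```objective-c
#import <AssetsLibrary/AssetsLibrary.h>

// Save the captured image (with whatever metadata the picker provides)
// to the Camera Roll, then read its metadata back via AssetsLibrary.
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
NSDictionary *pickerMetadata = [info objectForKey:UIImagePickerControllerMediaMetadata];

[library writeImageToSavedPhotosAlbum:image.CGImage
                             metadata:pickerMetadata
                      completionBlock:^(NSURL *assetURL, NSError *error) {
    if (error) {
        NSLog(@"save failed: %@", error);
        return;
    }
    [library assetForURL:assetURL resultBlock:^(ALAsset *asset) {
        // Dump the full metadata of the saved asset.
        NSLog(@"%@", [[asset defaultRepresentation] metadata]);
    } failureBlock:^(NSError *err) {
        NSLog(@"read-back failed: %@", err);
    }];
}];
```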

Thanks!


Here's the code I wrote based on Caleb's answer.

    UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
    NSData *jpeg = UIImageJPEGRepresentation(image, 1.0);
    CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)jpeg, NULL);
    NSDictionary *metadataNew = (__bridge NSDictionary *)CGImageSourceCopyPropertiesAtIndex(source, 0, NULL);
    NSLog(@"%@", metadataNew);

and my Console shows:

    2012-04-26 14:15:37:137 ferret[2060:1799] {
        ColorModel = RGB;
        Depth = 8;
        Orientation = 6;
        PixelHeight = 1936;
        PixelWidth = 2592;
        "{Exif}" = {
            ColorSpace = 1;
            PixelXDimension = 2592;
            PixelYDimension = 1936;
        };
        "{JFIF}" = {
            DensityUnit = 0;
            JFIFVersion = (1, 1);
            XDensity = 1;
            YDensity = 1;
        };
        "{TIFF}" = {
            Orientation = 6;
        };
    }

No latitude/longitude.

asked Apr 24 '12 by Paul Cezanne


2 Answers

The problem is that, since iOS 4, the image returned by UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage]; has the geolocation stripped out. To solve this you have to go through the original photo's asset URL to get access to the full image metadata, with something like this:

    - (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
        NSURL *referenceURL = [info objectForKey:UIImagePickerControllerReferenceURL];
        ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
        [library assetForURL:referenceURL resultBlock:^(ALAsset *asset) {
            ALAssetRepresentation *rep = [asset defaultRepresentation];
            NSDictionary *metadata = rep.metadata;
            NSLog(@"%@", metadata);

            CGImageRef iref = [rep fullScreenImage];
            if (iref) {
                self.imageView.image = [UIImage imageWithCGImage:iref];
            }
        } failureBlock:^(NSError *error) {
            // error handling
        }];
    }

The output should be something like:

    {
        ColorModel = RGB;
        DPIHeight = 72;
        DPIWidth = 72;
        Depth = 8;
        Orientation = 6;
        PixelHeight = 1936;
        PixelWidth = 2592;
        "{Exif}" = {
            ApertureValue = "2.970854";
            BrightnessValue = "1.115874";
            ColorSpace = 1;
            ComponentsConfiguration = (0, 0, 0, 1);
            DateTimeDigitized = "2012:07:14 21:55:05";
            DateTimeOriginal = "2012:07:14 21:55:05";
            ExifVersion = (2, 2, 1);
            ExposureMode = 0;
            ExposureProgram = 2;
            ExposureTime = "0.06666667";
            FNumber = "2.8";
            Flash = 24;
            FlashPixVersion = (1, 0);
            FocalLength = "3.85";
            ISOSpeedRatings = (200);
            MeteringMode = 5;
            PixelXDimension = 2592;
            PixelYDimension = 1936;
            SceneCaptureType = 0;
            SensingMethod = 2;
            Sharpness = 2;
            ShutterSpeedValue = "3.9112";
            SubjectArea = (1295, 967, 699, 696);
            WhiteBalance = 0;
        };
        "{GPS}" = {
            Altitude = "1167.528";
            AltitudeRef = 0;
            ImgDirection = "278.8303";
            ImgDirectionRef = T;
            Latitude = "15.8235";
            LatitudeRef = S;
            Longitude = "47.99416666666666";
            LongitudeRef = W;
            TimeStamp = "00:55:04.59";
        };
        "{TIFF}" = {
            DateTime = "2012:07:14 21:55:05";
            Make = Apple;
            Model = "iPhone 4";
            Orientation = 6;
            ResolutionUnit = 2;
            Software = "5.1.1";
            XResolution = 72;
            YResolution = 72;
            "_YCbCrPositioning" = 1;
        };
    }
answered Sep 18 '22 by Carlos Borges


We have worked a lot with the camera and UIImagePickerController and, at least up to and including iOS 5.1.1, it does not return location data in the metadata for either photos or videos shot with UIImagePickerController.

It doesn't matter whether location services is enabled for the Camera app or not; this controls the Camera app's use of location services, not the camera function within UIImagePickerController.

Your app will need to use Core Location (CLLocationManager) to get the location and then add it to the image or video returned from the camera. Whether your app can get the location will depend on whether the user authorizes access to location services for your app. And note that the user can disable location services for your app (or entirely for the device) at any time via Settings > Location Services.
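A minimal sketch of that approach (the helper name `geotaggedJPEGData:location:` is my own, and the CLLocation is assumed to come from a CLLocationManager the app started before presenting the picker): build a "{GPS}" dictionary from the fix and write it into the JPEG with CGImageDestination.

```objective-c
#import <CoreLocation/CoreLocation.h>
#import <ImageIO/ImageIO.h>
#import <MobileCoreServices/MobileCoreServices.h>

// Hypothetical helper: returns JPEG data with a {GPS} dictionary merged in.
// `jpeg` is the camera image data; `location` is the app's own CoreLocation fix.
- (NSData *)geotaggedJPEGData:(NSData *)jpeg location:(CLLocation *)location {
    // EXIF stores latitude/longitude as unsigned values plus N/S and E/W refs.
    CLLocationDegrees lat = location.coordinate.latitude;
    CLLocationDegrees lon = location.coordinate.longitude;
    NSDictionary *gps = @{
        (NSString *)kCGImagePropertyGPSLatitude     : @(fabs(lat)),
        (NSString *)kCGImagePropertyGPSLatitudeRef  : (lat >= 0 ? @"N" : @"S"),
        (NSString *)kCGImagePropertyGPSLongitude    : @(fabs(lon)),
        (NSString *)kCGImagePropertyGPSLongitudeRef : (lon >= 0 ? @"E" : @"W"),
        (NSString *)kCGImagePropertyGPSAltitude     : @(location.altitude),
    };

    CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)jpeg, NULL);
    NSMutableData *output = [NSMutableData data];
    CGImageDestinationRef dest = CGImageDestinationCreateWithData(
        (__bridge CFMutableDataRef)output, kUTTypeJPEG, 1, NULL);

    // Copy the image across unchanged, adding only the {GPS} dictionary.
    NSDictionary *properties = @{ (NSString *)kCGImagePropertyGPSDictionary : gps };
    CGImageDestinationAddImageFromSource(dest, source, 0,
                                         (__bridge CFDictionaryRef)properties);
    CGImageDestinationFinalize(dest);

    CFRelease(dest);
    CFRelease(source);
    return output;
}
```

The accuracy of the resulting tag is only as good as the timing of the fix, which is exactly the limitation the question raises for a moving device.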

answered Sep 22 '22 by Chris Markle