Metadata Extraction on Mobile: iOS and Android App Development Guide
Mobile metadata extraction works differently from desktop tools. iOS and Android each have their own APIs, permission models, and memory constraints that shape how your app reads EXIF, video, and document metadata. This guide walks through platform-native approaches with working code, covers the permission changes introduced in iOS 14 and Android's scoped storage, and shows how to handle extracted metadata at scale.
Why Mobile Metadata Extraction Differs from Desktop
Desktop metadata tools like ExifTool and Python's Pillow library run with direct filesystem access, generous memory, and no permission prompts. Mobile apps operate under a different set of constraints that affect every part of your extraction pipeline.
The key differences come down to three areas:
- Sandboxed storage: iOS and Android apps cannot freely browse the filesystem. On iOS, each app has its own sandbox and must request access to the Photos library or use document pickers. Android introduced scoped storage in Android 10 (API 29) and enforced it in Android 11 (API 30), limiting direct file path access.
- Permission prompts: Users must grant explicit consent before your app reads photo metadata. iOS introduced limited photo library access in iOS 14, where users can select specific photos your app can see. Android requires READ_MEDIA_IMAGES (API 33+) or the older READ_EXTERNAL_STORAGE permission.
- Memory and battery: Mobile devices have less RAM than servers or desktops. Processing a batch of high-resolution RAW files that would be trivial on a laptop can cause memory pressure warnings or app termination on a phone. You need to extract metadata without loading full image data into memory.
These constraints mean you cannot port a desktop extraction script to mobile. You need platform-native APIs designed for these environments.
Reading Image Metadata on iOS
iOS provides two primary paths for reading image metadata: the ImageIO framework for direct file-level extraction and the Photos framework for accessing the user's photo library.
Reading EXIF with CGImageSource
CGImageSource is the lower-level option. It reads metadata directly from image data without decoding the full image into a bitmap, which keeps memory usage low.
import ImageIO
func extractMetadata(from imageData: Data) -> [String: Any]? {
guard let source = CGImageSourceCreateWithData(imageData as CFData, nil),
let properties = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [String: Any] else {
return nil
}
return properties
}
The returned dictionary contains nested dictionaries keyed by format. The EXIF data lives under kCGImagePropertyExifDictionary, GPS data under kCGImagePropertyGPSDictionary, and TIFF properties under kCGImagePropertyTIFFDictionary. Each contains typed values: strings for camera make and model, doubles for GPS coordinates, integers for image dimensions.
if let exif = properties[kCGImagePropertyExifDictionary as String] as? [String: Any] {
let exposureTime = exif[kCGImagePropertyExifExposureTime as String]
let isoSpeed = exif[kCGImagePropertyExifISOSpeedRatings as String]
let focalLength = exif[kCGImagePropertyExifFocalLength as String]
}
CGImageSource works with JPEG, PNG, TIFF, HEIF, and RAW formats. For HEIF files (the default capture format on modern iPhones), it reads the full EXIF payload including MakerApple tags that contain Apple-specific data like lens information and HDR gain maps.
Accessing Photos Library Metadata
The Photos framework (PhotoKit) gives you access to the user's photo library with richer metadata, including location, creation date, favorite status, and asset type.
import Photos
func requestPhotoAccess() {
PHPhotoLibrary.requestAuthorization(for: .readWrite) { status in
switch status {
case .authorized:
self.fetchPhotoMetadata()
case .limited:
self.fetchPhotoMetadata() // Only selected photos
case .denied, .restricted:
    // The completion handler can run on a background queue;
    // hop to the main queue before touching UI.
    DispatchQueue.main.async { self.showPermissionDeniedUI() }
default:
break
}
}
}
Since iOS 14, users can grant "limited" access, meaning your app only sees photos the user explicitly selected. Your code must handle PHAuthorizationStatus.limited as a valid state, not an error. The metadata you receive for each selected asset is still complete. The limitation is which assets you can access, not what data each asset contains.
To read the full EXIF data from a PHAsset, request the image data and pass it through CGImageSource:
let options = PHImageRequestOptions()
options.version = .current
options.isSynchronous = false
PHImageManager.default().requestImageDataAndOrientation(
for: asset, options: options
) { data, _, _, _ in
guard let imageData = data else { return }
let metadata = self.extractMetadata(from: imageData)
}
Android Metadata Extraction with ExifInterface and MediaStore
Android offers ExifInterface for image EXIF data and MediaStore for querying the device's media database. The two serve different purposes: ExifInterface reads raw tag values from file bytes, while MediaStore provides indexed, queryable metadata across the device's entire media collection.
Reading EXIF with ExifInterface
The AndroidX ExifInterface library supports reading metadata from JPEG, PNG, WebP, DNG, CR2, NEF, ARW, RAF, ORF, SRW, PEF, and HEIF files. Add it to your module's build.gradle:
dependencies {
implementation("androidx.exifinterface:exifinterface:1.3.7")
}
With scoped storage (Android 10+), you typically receive a content URI rather than a file path. Use the InputStream constructor:
import androidx.exifinterface.media.ExifInterface
fun extractExif(context: Context, uri: Uri): Map<String, String?> {
val inputStream = context.contentResolver.openInputStream(uri)
?: return emptyMap()
return inputStream.use { stream ->
val exif = ExifInterface(stream)
mapOf(
"make" to exif.getAttribute(ExifInterface.TAG_MAKE),
"model" to exif.getAttribute(ExifInterface.TAG_MODEL),
"datetime" to exif.getAttribute(ExifInterface.TAG_DATETIME),
"exposure" to exif.getAttribute(ExifInterface.TAG_EXPOSURE_TIME),
"iso" to exif.getAttribute(ExifInterface.TAG_PHOTOGRAPHIC_SENSITIVITY),
"focalLength" to exif.getAttribute(ExifInterface.TAG_FOCAL_LENGTH),
"width" to exif.getAttribute(ExifInterface.TAG_IMAGE_WIDTH),
"height" to exif.getAttribute(ExifInterface.TAG_IMAGE_LENGTH)
)
}
}
ExifInterface defines constants for well over a hundred tags covering camera settings, GPS coordinates, image dimensions, orientation, white balance, flash status, and more. For GPS extraction specifically, use the helper methods:
val latLong = exif.latLong // Returns DoubleArray? with [latitude, longitude]
Handling Android Permissions
Android 13 (API 33) split the old READ_EXTERNAL_STORAGE into granular permissions. For photo metadata, declare READ_MEDIA_IMAGES in your manifest:
<uses-permission android:name="android.permission.READ_MEDIA_IMAGES" />
<!-- For Android 12 and below -->
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE"
android:maxSdkVersion="32" />
Request the appropriate permission at runtime based on the API level:
val permission = if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.TIRAMISU) {
Manifest.permission.READ_MEDIA_IMAGES
} else {
Manifest.permission.READ_EXTERNAL_STORAGE
}
ActivityCompat.requestPermissions(activity, arrayOf(permission), REQUEST_CODE)
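On current Android versions, the Activity Result API is the recommended replacement for ActivityCompat.requestPermissions. A minimal sketch inside a ComponentActivity (the loadPhotoMetadata and showPermissionRationale callbacks are hypothetical names for your own handlers):

import androidx.activity.result.contract.ActivityResultContracts

// Register once during Activity initialization; call launch() later.
private val permissionLauncher = registerForActivityResult(
    ActivityResultContracts.RequestPermission()
) { granted ->
    if (granted) loadPhotoMetadata()   // hypothetical: start extraction
    else showPermissionRationale()     // hypothetical: explain the denial
}

fun requestPhotoPermission() {
    val permission = if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.TIRAMISU) {
        Manifest.permission.READ_MEDIA_IMAGES
    } else {
        Manifest.permission.READ_EXTERNAL_STORAGE
    }
    permissionLauncher.launch(permission)
}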
Querying MediaStore for Batch Operations
When you need metadata across many files (building a gallery, filtering by date range, finding photos with GPS data), MediaStore is more efficient than opening each file individually:
val projection = arrayOf(
    MediaStore.Images.Media._ID,
    MediaStore.Images.Media.DISPLAY_NAME,
    MediaStore.Images.Media.DATE_TAKEN,
    MediaStore.Images.Media.SIZE,
    MediaStore.Images.Media.WIDTH,
    MediaStore.Images.Media.HEIGHT
)
context.contentResolver.query(
    MediaStore.Images.Media.EXTERNAL_CONTENT_URI,
    projection,
    null, null,
    "${MediaStore.Images.Media.DATE_TAKEN} DESC"
)?.use { cursor ->  // use {} closes the cursor even if reading throws
    val nameColumn = cursor.getColumnIndexOrThrow(MediaStore.Images.Media.DISPLAY_NAME)
    val dateColumn = cursor.getColumnIndexOrThrow(MediaStore.Images.Media.DATE_TAKEN)
    while (cursor.moveToNext()) {
        val name = cursor.getString(nameColumn)
        val dateTaken = cursor.getLong(dateColumn)
        // ... read the remaining columns the same way
    }
}
MediaStore returns indexed data quickly, but it only exposes a subset of EXIF fields. For full EXIF extraction on specific files, combine MediaStore queries with ExifInterface reads.
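One way to combine them, sketched below (fullExifForRecentPhotos is our own helper name): query MediaStore for row IDs, build a content URI per row, and hand each URI to the extractExif function shown earlier.

import android.content.ContentUris

fun fullExifForRecentPhotos(context: Context, limit: Int = 50): List<Map<String, String?>> {
    val results = mutableListOf<Map<String, String?>>()
    context.contentResolver.query(
        MediaStore.Images.Media.EXTERNAL_CONTENT_URI,
        arrayOf(MediaStore.Images.Media._ID),
        null, null,
        "${MediaStore.Images.Media.DATE_TAKEN} DESC"
    )?.use { cursor ->
        val idColumn = cursor.getColumnIndexOrThrow(MediaStore.Images.Media._ID)
        while (cursor.moveToNext() && results.size < limit) {
            // Turn the row ID into a content URI that ExifInterface can open
            val uri = ContentUris.withAppendedId(
                MediaStore.Images.Media.EXTERNAL_CONTENT_URI,
                cursor.getLong(idColumn)
            )
            results += extractExif(context, uri)
        }
    }
    return results
}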
Manage Extracted Metadata at Scale
Upload files from your mobile app to Fast.io workspaces. Metadata Views automatically extract and organize properties into a queryable grid, with no custom extraction code needed on the server side. Free tier includes 50 GB storage and 5,000 monthly credits.
Video Metadata Extraction on Both Platforms
Video metadata follows different standards than images. Mobile video files use container formats (MP4, MOV, WebM) with embedded tracks, codecs, and timing information that EXIF-based tools cannot read.
iOS: AVAsset for Video Properties
AVFoundation's AVAsset class reads video metadata without loading the full file into memory:
import AVFoundation
func extractVideoMetadata(from url: URL) async -> [String: Any] {
let asset = AVURLAsset(url: url)
var result: [String: Any] = [:]
if let duration = try? await asset.load(.duration) {
result["duration"] = CMTimeGetSeconds(duration)
}
if let tracks = try? await asset.loadTracks(withMediaType: .video),
let track = tracks.first {
let size = try? await track.load(.naturalSize)
result["width"] = size?.width
result["height"] = size?.height
}
// Read embedded metadata items
if let metadata = try? await asset.load(.metadata) {
for item in metadata {
if let key = item.commonKey?.rawValue,
let value = try? await item.load(.value) {
result[key] = value
}
}
}
return result
}
This returns resolution, duration, creation date, and any embedded metadata like GPS location (which iOS records in MOV files captured with the Camera app).
Android: MediaMetadataRetriever
Android's MediaMetadataRetriever extracts properties from video and audio files:
import android.media.MediaMetadataRetriever
fun extractVideoMetadata(context: Context, uri: Uri): Map<String, String?> {
val retriever = MediaMetadataRetriever()
return try {
retriever.setDataSource(context, uri)
mapOf(
"duration" to retriever.extractMetadata(
MediaMetadataRetriever.METADATA_KEY_DURATION
),
"width" to retriever.extractMetadata(
MediaMetadataRetriever.METADATA_KEY_VIDEO_WIDTH
),
"height" to retriever.extractMetadata(
MediaMetadataRetriever.METADATA_KEY_VIDEO_HEIGHT
),
"mimeType" to retriever.extractMetadata(
MediaMetadataRetriever.METADATA_KEY_MIMETYPE
),
"bitrate" to retriever.extractMetadata(
MediaMetadataRetriever.METADATA_KEY_BITRATE
),
"date" to retriever.extractMetadata(
MediaMetadataRetriever.METADATA_KEY_DATE
),
"rotation" to retriever.extractMetadata(
MediaMetadataRetriever.METADATA_KEY_VIDEO_ROTATION
)
)
} finally {
retriever.release()
}
}
For more advanced video analysis on Android, the Media3 MetadataRetriever (part of the Jetpack Media3 library) provides codec information, track details, and sampling rates without initializing a full player instance.
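A sketch of what that looks like, assuming the Media3 ExoPlayer artifact is on the classpath; the inspectTracks wrapper is our own name:

import androidx.media3.common.MediaItem
import androidx.media3.exoplayer.MetadataRetriever

// Inspect the track list of a media file without building a player.
// retrieveMetadata returns a ListenableFuture; get() blocks, so call
// this from a background thread.
fun inspectTracks(context: Context, uri: Uri) {
    val trackGroups = MetadataRetriever
        .retrieveMetadata(context, MediaItem.fromUri(uri))
        .get()
    for (i in 0 until trackGroups.length) {
        val group = trackGroups[i]
        for (j in 0 until group.length) {
            val format = group.getFormat(j)
            // Fields like width/height hold Format.NO_VALUE when absent
            println("${format.sampleMimeType} ${format.width}x${format.height}")
        }
    }
}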
Cross-Platform Libraries
If you are building with Kotlin Multiplatform or need consistent behavior across platforms, Ashampoo's Kim library provides EXIF read/write support for both Android and iOS targets from shared Kotlin code. It accepts ByteArray input on all platforms and adds platform-specific overloads for InputStream (Android) and NSData (iOS).
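A minimal sketch of reading metadata from shared Kotlin code, based on the entry point the Kim project documents; treat the exact API surface as version-dependent, and note that imageBytes is a placeholder for bytes you have already loaded:

import com.ashampoo.kim.Kim

// imageBytes might come from a file, a content URI, or NSData bridged on iOS
val metadata = Kim.readMetadata(imageBytes)
println(metadata) // prints the parsed directories and tags, if any were found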
Practical Patterns for Production Apps
Reading a single photo's EXIF data is straightforward. The complexity comes when you process hundreds or thousands of files, handle format edge cases, and deal with missing or corrupted metadata.
Batch Processing Without Memory Pressure
Both platforms let you read metadata without decoding full images, but you still need to manage how many files you process concurrently. On iOS, use a serial DispatchQueue or an actor to limit concurrent CGImageSource operations:
actor MetadataExtractor {
func extractAll(from assets: [PHAsset]) async -> [String: [String: Any]] {
var results: [String: [String: Any]] = [:]
for asset in assets {
    // extractSingle is assumed to wrap the PHImageManager request
    // and CGImageSource extraction shown earlier.
    let metadata = await extractSingle(asset)
results[asset.localIdentifier] = metadata
}
return results
}
}
On Android, use Kotlin coroutines with a limited dispatcher to control concurrency:
val extractionDispatcher = Dispatchers.IO.limitedParallelism(4)
suspend fun extractBatch(uris: List<Uri>): List<Map<String, String?>> =
coroutineScope {
uris.map { uri ->
async(extractionDispatcher) {
extractExif(context, uri)
}
}.awaitAll()
}
Handling Missing and Malformed Data
Not every image contains complete metadata. Screenshots have no EXIF data. Images shared through messaging apps often have metadata stripped. Some cameras write non-standard tag values. Your extraction code needs fallback logic:
- Check for null returns on every tag read. Both CGImageSource and ExifInterface return nil/null for missing tags.
- Validate GPS coordinates before using them. Coordinates of (0.0, 0.0) usually indicate missing data, not a location in the Gulf of Guinea.
- Parse date strings defensively. EXIF date format is "YYYY:MM:DD HH:MM:SS" (with colons in the date), which differs from ISO 8601, and some cameras write timezone-unaware dates. The sketch below shows defensive handling of both dates and GPS coordinates.
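A sketch of both checks in Kotlin (parseExifDate and validLatLong are our own helper names; validLatLong pairs with the exif.latLong helper shown earlier):

import java.text.SimpleDateFormat
import java.util.Date
import java.util.Locale
import kotlin.math.abs

// EXIF dates use colons in the date portion, e.g. "2024:06:01 14:30:00".
// SimpleDateFormat is not thread-safe, so create one per call in concurrent code.
fun parseExifDate(raw: String?): Date? = raw?.let {
    runCatching {
        SimpleDateFormat("yyyy:MM:dd HH:mm:ss", Locale.US).parse(it)
    }.getOrNull()
}

// Treat (0, 0) and out-of-range values as "no GPS data".
fun validLatLong(latLong: DoubleArray?): DoubleArray? {
    val (lat, lon) = latLong?.takeIf { it.size == 2 } ?: return null
    if (abs(lat) < 1e-6 && abs(lon) < 1e-6) return null
    if (abs(lat) > 90.0 || abs(lon) > 180.0) return null
    return latLong
}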
Privacy Considerations
If your app uploads photos to a server, consider whether you should strip location metadata before transmission. Many users do not realize their photos contain GPS coordinates. Best practice is to give users explicit control: show what metadata will be included and let them opt out of location sharing.
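One way to strip location with the same AndroidX ExifInterface library, sketched below; it requires a writable local copy of the file, because the stream-based constructor used earlier cannot save changes:

import androidx.exifinterface.media.ExifInterface
import java.io.File

// Remove location tags from a local copy before uploading it.
// Passing null to setAttribute deletes the tag; saveAttributes rewrites
// the file in place (JPEG is supported, plus PNG and WebP in recent
// library versions).
fun stripGps(localCopy: File) {
    val exif = ExifInterface(localCopy)
    listOf(
        ExifInterface.TAG_GPS_LATITUDE,
        ExifInterface.TAG_GPS_LATITUDE_REF,
        ExifInterface.TAG_GPS_LONGITUDE,
        ExifInterface.TAG_GPS_LONGITUDE_REF,
        ExifInterface.TAG_GPS_ALTITUDE,
        ExifInterface.TAG_GPS_ALTITUDE_REF
    ).forEach { exif.setAttribute(it, null) }
    exif.saveAttributes()
}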
For apps that extract metadata for local use only (sorting a gallery, finding duplicate photos, generating statistics), you still need to communicate in your permission request why your app needs photo library access.
Scaling Metadata Workflows Beyond the Device
Mobile metadata extraction is often just the first step. The extracted data needs to go somewhere: a database, a search index, a content management system, or a file management platform where teams can query and act on it.
From Extraction to Structured Data
Once you have extracted metadata on-device, you might send it to a backend API along with the file itself. The challenge is maintaining the association between the file and its metadata through upload, processing, and storage.
A common pattern is to extract metadata locally, include it as structured JSON in the upload request, and let the server index it alongside the file. This avoids re-extracting metadata server-side and gives the user immediate feedback about what was captured.
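A minimal sketch of that pattern, assuming an OkHttp client and a hypothetical upload endpoint; the form field names are illustrative:

import okhttp3.MediaType.Companion.toMediaType
import okhttp3.MultipartBody
import okhttp3.OkHttpClient
import okhttp3.Request
import okhttp3.RequestBody.Companion.asRequestBody
import org.json.JSONObject
import java.io.File

// Ship the file and its locally extracted metadata in one multipart
// request so the server can index without re-extracting.
fun upload(client: OkHttpClient, file: File, metadata: Map<String, String?>) {
    val json = JSONObject(metadata.filterValues { it != null }).toString()
    val body = MultipartBody.Builder()
        .setType(MultipartBody.FORM)
        .addFormDataPart("metadata", json)
        .addFormDataPart("file", file.name,
            file.asRequestBody("image/jpeg".toMediaType()))
        .build()
    val request = Request.Builder()
        .url("https://api.example.com/upload") // hypothetical endpoint
        .post(body)
        .build()
    client.newCall(request).execute().use { response ->
        check(response.isSuccessful) { "Upload failed: HTTP ${response.code}" }
    }
}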
For teams that manage large volumes of photos, videos, or documents, manual metadata organization breaks down quickly. You need a system that can automatically extract, index, and make metadata queryable.
Automated Extraction with Fast.io Metadata Views
Fast.io's Metadata Views take a different approach from on-device extraction. Instead of writing extraction code for each file type, you describe the fields you want in plain language. The system designs a typed schema (Text, Integer, Decimal, Boolean, URL, JSON, Date & Time), matches files in your workspace, and populates a sortable, filterable grid.
This works well as the destination for files captured on mobile. Your app uploads photos or documents to a Fast.io workspace, and Metadata Views automatically extract the properties you defined, across PDFs, images, Word docs, spreadsheets, and even scanned pages. No separate extraction pipeline to maintain.
For development teams building mobile apps that generate or collect files, Fast.io's workspace model also provides file versioning, granular permissions, and audit trails. Agents and humans share the same workspaces, so an automated pipeline can process uploads while team members review results in the same interface.
Other platforms handle parts of this workflow. Google Cloud Vision API extracts labels and text from images. AWS Rekognition provides similar image analysis. Firebase ML Kit runs some extraction on-device. The tradeoff is that these services are purpose-built for specific extraction tasks, while a workspace platform like Fast.io combines storage, extraction, and collaboration in one layer.
Frequently Asked Questions
How do I extract EXIF data in an iOS app?
Use the ImageIO framework's CGImageSource. Create a source from your image data with CGImageSourceCreateWithData, then call CGImageSourceCopyPropertiesAtIndex to get a dictionary containing EXIF, GPS, TIFF, and format-specific metadata. This reads metadata without decoding the full image bitmap, keeping memory usage low.
What Android API reads photo metadata?
The AndroidX ExifInterface library is the primary API. Create an ExifInterface instance from an InputStream (required for content URIs on Android 10+), then call getAttribute() with tag constants like TAG_MAKE, TAG_DATETIME, or TAG_GPS_LATITUDE. For batch queries across many files, use MediaStore's ContentResolver query instead.
How do I access file metadata on mobile?
Both iOS and Android require user permission to access photo library files. On iOS, use PHPhotoLibrary.requestAuthorization to get access, then request image data through PHImageManager. On Android, request READ_MEDIA_IMAGES (API 33+) or READ_EXTERNAL_STORAGE (older versions), then open files through ContentResolver.
Can I extract video metadata in a mobile app?
Yes. On iOS, use AVFoundation's AVAsset class to read duration, resolution, codec information, and embedded metadata like GPS location. On Android, use MediaMetadataRetriever to extract duration, dimensions, bitrate, rotation, and MIME type. Neither requires loading the full video into memory.
What happens when iOS users grant limited photo access?
When a user selects "Select Photos" instead of "Allow Full Access" (available since iOS 14), your app can only see the specific photos they chose. The metadata for each selected photo is still complete. You get full EXIF, GPS, and camera data for accessible assets. The limitation applies to which assets are visible, not what data they contain.
How do I handle metadata from photos shared via messaging apps?
Most messaging apps strip EXIF data from photos during compression and re-encoding. Your extraction code should handle null or missing metadata gracefully. Check for null on every tag read, validate GPS coordinates (0,0 usually means missing data), and provide sensible defaults or clear indicators when metadata is unavailable.