Camera (Video) Integration into a Mobile Application
Video recording is heavier than photo capture in every way: higher resolution, more memory, more failure points. A dating app wants 30-second video clips — it seems trivial, until AVAssetExportSession fails with .exportFailed on an iPhone SE running iOS 16.4 at certain resolution and bitrate combinations.
Main Difficulties
Recording Length Limit. AVCaptureMovieFileOutput does have a built-in cap: set maxRecordedDuration and iOS stops the recording automatically (UIImagePickerController offers videoMaximumDuration for the same purpose). You still want your own timer for a visible countdown in a custom UI. Miss the limit entirely — a user records a 40-minute video that can't be uploaded afterward.
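A minimal sketch of the built-in cap on a custom-camera output (the 30-second and 100 MB values are assumptions for this example):

```swift
import AVFoundation

// Hypothetical helper: cap a custom-camera recording at 30 seconds.
// maxRecordedDuration stops the recording automatically; the delegate
// then receives fileOutput(_:didFinishRecordingTo:from:error:) with an
// AVError.maximumDurationReached error, which still carries a valid file.
func configureClipLimit(for output: AVCaptureMovieFileOutput,
                        seconds: Double = 30) {
    output.maxRecordedDuration = CMTime(seconds: seconds,
                                        preferredTimescale: 600)
    // Optional second safety net: cap the file size too (100 MB is an assumption).
    output.maxRecordedFileSize = 100 * 1024 * 1024
}
```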
File Size. Thirty seconds of 4K on an iPhone 15 Pro can run to hundreds of megabytes at default settings. For most apps this is unacceptable. Before upload you need transcoding: AVAssetExportSession with AVAssetExportPresetMediumQuality, or AVAssetWriter with AVVideoCompressionPropertiesKey and AVVideoAverageBitRateKey when you need precise bitrate control (the export session's presets don't expose a bitrate knob). On Android — MediaCodec directly, or Transformer from media3.
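A sanity check worth doing before picking a target bitrate — output size is roughly bitrate × duration / 8 (the 3.5 Mbps video and 128 kbps audio figures below are example assumptions, not recommendations):

```swift
import Foundation

// Back-of-the-envelope estimate for a transcoded clip: size in bytes
// is approximately (video bitrate + audio bitrate) * duration / 8.
func estimatedSizeBytes(videoBitrate: Double,
                        audioBitrate: Double,
                        durationSeconds: Double) -> Double {
    (videoBitrate + audioBitrate) * durationSeconds / 8
}

// 30 s at 3.5 Mbps video + 128 kbps audio ≈ 13.6 MB — fine for upload.
let size = estimatedSizeBytes(videoBitrate: 3_500_000,
                              audioBitrate: 128_000,
                              durationSeconds: 30)
```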
Gallery Selection. PHPickerViewController with PHPickerFilter.videos on iOS 14+. Important nuance: loadFileRepresentation(forTypeIdentifier: "public.movie") copies the file to a temp location that the system deletes as soon as the completion handler returns — copy it to your own directory inside the handler, before returning.
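A sketch of the delegate callback under that constraint (copying into the temp directory here; moving to Documents works the same way):

```swift
import PhotosUI
import UniformTypeIdentifiers

// In your PHPickerViewControllerDelegate. The URL handed to the closure
// is deleted when the closure returns, so copy the file synchronously
// inside it before doing anything else with the result.
func picker(_ picker: PHPickerViewController,
            didFinishPicking results: [PHPickerResult]) {
    picker.dismiss(animated: true)
    guard let provider = results.first?.itemProvider else { return }
    provider.loadFileRepresentation(
        forTypeIdentifier: UTType.movie.identifier
    ) { url, error in
        guard let url else { return }
        let dest = FileManager.default.temporaryDirectory
            .appendingPathComponent(url.lastPathComponent)
        try? FileManager.default.removeItem(at: dest)   // overwrite stale copy
        try? FileManager.default.copyItem(at: url, to: dest)
        // `dest` survives past this closure; transcode/upload from there.
    }
}
```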
How We Implement
iOS. AVCaptureSession + AVCaptureMovieFileOutput for a custom camera with full parameter control. For simple cases — UIImagePickerController with mediaTypes: [UTType.movie.identifier] (kUTTypeMovie is deprecated since iOS 14). Compress video via AVAssetExportSession, choosing the preset by purpose: AVAssetExportPreset960x540 for previews, AVAssetExportPreset1280x720 for main content.
let exportSession = AVAssetExportSession(asset: asset, presetName: AVAssetExportPreset1280x720)
exportSession?.outputURL = outputURL
exportSession?.outputFileType = .mp4
exportSession?.exportAsynchronously {
    // Always check the status: .failed with .exportFailed is exactly
    // the iPhone SE / iOS 16.4 case mentioned above.
    guard exportSession?.status == .completed else { return } // inspect exportSession?.error
    // Hand the compressed file at outputURL to the uploader.
}
Android. The CameraX VideoCapture API stabilized in 1.3.x. Record via Recorder → PendingRecording → Recording. For the time limit — call stop() from Handler.postDelayed. Compression — Transformer from androidx.media3:media3-transformer (the successor to the ExoPlayer2 transformer module).
Flutter. The camera plugin (pub.dev) for a custom camera, image_picker for gallery selection. For transcoding — video_compress, or native code via a MethodChannel.
Upload
Upload video files in chunks (multipart or resumable upload). On iOS — URLSessionUploadTask with a background configuration. On Android — WorkManager + OkHttp, so the upload survives the app being backgrounded. Send progress to the UI via StateFlow or LiveData.
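On iOS that can look like the following sketch (the session identifier, endpoint URL, and bare POST are assumptions; a real uploader would also set auth headers and content type):

```swift
import Foundation

// Sketch: background upload of the transcoded clip. Background sessions
// can only upload from a file — which suits a clip already on disk —
// and keep running after the app is suspended.
func startBackgroundUpload(of fileURL: URL,
                           delegate: URLSessionDelegate) -> URLSessionUploadTask {
    let config = URLSessionConfiguration.background(
        withIdentifier: "com.example.videoUpload")   // identifier is an assumption
    config.isDiscretionary = false                   // start promptly, not at system leisure
    let session = URLSession(configuration: config,
                             delegate: delegate, delegateQueue: nil)
    var request = URLRequest(url: URL(string: "https://api.example.com/v1/videos")!)
    request.httpMethod = "POST"
    let task = session.uploadTask(with: request, fromFile: fileURL)
    task.resume()
    // Progress arrives on the delegate via
    // urlSession(_:task:didSendBodyData:totalBytesSent:totalBytesExpectedToSend:).
    return task
}
```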
Preview of Recorded Video
After recording, the user should see a preview before sending. On iOS — AVPlayerViewController with a local URL, or a custom AVPlayer. On Android — ExoPlayer with a local Uri. Important: write temp files to FileManager.default.temporaryDirectory (iOS) or cacheDir (Android) and clean up after a successful upload, otherwise drafts accumulate into gigabytes over sessions.
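A sketch of the cleanup step, assuming draft clips are written with a recognizable filename prefix (the "clip-" prefix is hypothetical):

```swift
import Foundation

// Purge leftover clip drafts after a successful upload. Returns how many
// files were removed, so the caller can log it.
func purgeDrafts(in directory: URL = FileManager.default.temporaryDirectory) throws -> Int {
    let fm = FileManager.default
    let files = try fm.contentsOfDirectory(at: directory,
                                           includingPropertiesForKeys: nil)
    var removed = 0
    // Only touch our own drafts; leave unrelated temp files alone.
    for file in files where file.lastPathComponent.hasPrefix("clip-") {
        try fm.removeItem(at: file)
        removed += 1
    }
    return removed
}
```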
React Native
react-native-vision-camera (v3+) is the modern choice, backed by CameraX on Android and AVFoundation on iOS. Its key advantage over the deprecated react-native-camera is Frame Processors: you can run a JavaScript worklet on each frame, e.g. for QR detection during recording. For gallery selection — react-native-image-picker with mediaType: 'video'.
Timeline
A standard integration (recording with a limit, gallery selection, compression, upload) — 2–3 days. A custom camera interface with its own UI controls — add 1–2 days.







