Implementing Video Cropping and Editing in Mobile Applications
A video editor in a mobile application is one of the most labor-intensive media tasks. Not because the API is complex, but because users expect responsiveness: the timeline must scroll smoothly, preview after cropping must appear instantly, and final export must not block the UI for 30 seconds.
Key Operations and Their Implementation
Time-based trimming. On iOS — AVAssetExportSession with timeRange:
```swift
exportSession.timeRange = CMTimeRange(
    start: CMTime(seconds: startTime, preferredTimescale: 600),
    duration: CMTime(seconds: duration, preferredTimescale: 600)
)
```
preferredTimescale: 600 is the standard for video: it divides evenly by the common frame rates (24/25/30/60 fps). Using timescale: 1 rounds times to whole seconds and causes frame-accuracy errors.
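Putting the trim together, a minimal export sketch might look like this (the function name `exportTrimmed` and its parameters are illustrative, not from any library):

```swift
import AVFoundation

// Sketch: trim a clip to [startTime, startTime + duration] and export as MP4.
// sourceURL, outputURL, startTime, duration are assumed inputs.
func exportTrimmed(sourceURL: URL, outputURL: URL,
                   startTime: Double, duration: Double,
                   completion: @escaping (AVAssetExportSession.Status) -> Void) {
    let asset = AVAsset(url: sourceURL)
    guard let session = AVAssetExportSession(
        asset: asset,
        presetName: AVAssetExportPreset1280x720
    ) else { return } // session creation can fail for unsupported assets

    session.outputURL = outputURL
    session.outputFileType = .mp4
    // timescale 600 avoids rounding errors at 24/25/30/60 fps
    session.timeRange = CMTimeRange(
        start: CMTime(seconds: startTime, preferredTimescale: 600),
        duration: CMTime(seconds: duration, preferredTimescale: 600)
    )
    session.exportAsynchronously {
        completion(session.status) // .completed, .failed, or .cancelled
    }
}
```

With a preset, the trim re-muxes or re-encodes as the preset dictates; frame-exact cuts on non-keyframe boundaries always require re-encoding.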
On Android — MediaMuxer + MediaExtractor for frame-accurate trimming without re-encoding (only if keyframes align). Otherwise — Transformer from media3 with explicit re-encoding.
Clip concatenation. iOS: AVMutableComposition — add AVMutableCompositionTrack for video and audio, insert via insertTimeRange(_:of:at:). Important: audio tracks combine separately from video tracks, otherwise desynchronization occurs.
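The separate-tracks rule above can be sketched as follows (a minimal version; the helper name `concatenate` is illustrative):

```swift
import AVFoundation

// Sketch: concatenate clips back to back, inserting video and audio
// into their own composition tracks to keep A/V in sync.
func concatenate(assets: [AVAsset]) throws -> AVMutableComposition {
    let composition = AVMutableComposition()
    let videoTrack = composition.addMutableTrack(
        withMediaType: .video, preferredTrackID: kCMPersistentTrackID_Invalid)
    let audioTrack = composition.addMutableTrack(
        withMediaType: .audio, preferredTrackID: kCMPersistentTrackID_Invalid)

    var cursor = CMTime.zero
    for asset in assets {
        let range = CMTimeRange(start: .zero, duration: asset.duration)
        if let src = asset.tracks(withMediaType: .video).first {
            try videoTrack?.insertTimeRange(range, of: src, at: cursor)
        }
        if let src = asset.tracks(withMediaType: .audio).first {
            try audioTrack?.insertTimeRange(range, of: src, at: cursor)
        }
        cursor = CMTimeAdd(cursor, asset.duration)
    }
    return composition
}
```

The resulting composition is itself an AVAsset, so it can be played in AVPlayer or passed straight to AVAssetExportSession.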
Android: media3 Transformer with Composition API (1.2.x+). Clip sequence — EditedMediaItemSequence.
Text overlays. iOS: AVVideoComposition + AVVideoCompositionCoreAnimationTool. Create CATextLayer, add animation via CABasicAnimation, pass to animationTool. On Android — OverlayEffect in media3 Transformer or draw via Canvas on each frame (slow, only for short content).
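On iOS this wiring looks roughly like the sketch below (render size and animation values are illustrative assumptions):

```swift
import AVFoundation
import QuartzCore

// Sketch: burn a text layer into the video via Core Animation.
// `composition` is the AVMutableComposition being exported; 1920×1080 is assumed.
func makeTextOverlayComposition(for composition: AVMutableComposition,
                                text: String) -> AVMutableVideoComposition {
    let renderSize = CGSize(width: 1920, height: 1080)

    let textLayer = CATextLayer()
    textLayer.string = text
    textLayer.fontSize = 72
    textLayer.alignmentMode = .center
    textLayer.frame = CGRect(x: 0, y: 100, width: renderSize.width, height: 120)

    // Fade the text in over the first second of the video
    let fade = CABasicAnimation(keyPath: "opacity")
    fade.fromValue = 0.0
    fade.toValue = 1.0
    fade.duration = 1.0
    fade.beginTime = AVCoreAnimationBeginTimeAtZero // plain 0 means "now"
    textLayer.add(fade, forKey: "fadeIn")

    // videoLayer is where frames are rendered; parentLayer stacks text on top
    let videoLayer = CALayer()
    let parentLayer = CALayer()
    videoLayer.frame = CGRect(origin: .zero, size: renderSize)
    parentLayer.frame = videoLayer.frame
    parentLayer.addSublayer(videoLayer)
    parentLayer.addSublayer(textLayer)

    let videoComposition = AVMutableVideoComposition(propertiesOf: composition)
    videoComposition.renderSize = renderSize
    videoComposition.animationTool = AVVideoCompositionCoreAnimationTool(
        postProcessingAsVideoLayer: videoLayer, in: parentLayer)
    return videoComposition
}
```

Note the `AVCoreAnimationBeginTimeAtZero` detail: in video compositions a beginTime of 0 is interpreted as CACurrentMediaTime(), so the constant is required for animations that start at the first frame.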
Timeline Preview
The most noticeable detail to users — a strip with video frames below the player. Generate via AVAssetImageGenerator on iOS:
```swift
generator.generateCGImagesAsynchronously(forTimes: times) { _, image, _, _, _ in
    // update UI on main thread
}
```
requestedTimeToleranceBefore/After = .zero gives precise frames but slower. For timeline, CMTime(seconds: 0.1, preferredTimescale: 600) suffices — nearest frames to the marker.
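A fuller sketch of the thumbnail strip, with evenly spaced sample times and the loose tolerance discussed above (the function name `generateThumbnails` is illustrative):

```swift
import AVFoundation

// Sketch: generate `count` evenly spaced thumbnails for the timeline strip.
func generateThumbnails(for asset: AVAsset, count: Int,
                        onFrame: @escaping (CGImage) -> Void) {
    let generator = AVAssetImageGenerator(asset: asset)
    generator.appliesPreferredTrackTransform = true // respect rotation metadata
    // Loose tolerance: the nearest frame is fine for a thumbnail and much faster
    let tolerance = CMTime(seconds: 0.1, preferredTimescale: 600)
    generator.requestedTimeToleranceBefore = tolerance
    generator.requestedTimeToleranceAfter = tolerance

    let step = asset.duration.seconds / Double(count)
    let times = (0..<count).map {
        NSValue(time: CMTime(seconds: Double($0) * step, preferredTimescale: 600))
    }
    generator.generateCGImagesAsynchronously(forTimes: times) { _, image, _, _, _ in
        if let image = image {
            DispatchQueue.main.async { onFrame(image) } // update UI on main thread
        }
    }
}
```

`appliesPreferredTrackTransform` matters in practice: without it, portrait phone footage comes back rotated.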
On Android — MediaMetadataRetriever.getFrameAtTime() in IO Dispatcher.
Export
Always asynchronously, with progress. iOS: poll exportSession.progress via a Timer every 0.1 s (there is no progress callback). Android: poll Transformer.getProgress(ProgressHolder) periodically; Transformer.Listener reports only completion and errors, not progress.
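The iOS polling pattern, sketched (the wrapper name `export` and its callbacks are illustrative):

```swift
import AVFoundation

// Sketch: poll export progress every 0.1 s and report it to the UI.
// `session` is an already-configured AVAssetExportSession.
func export(_ session: AVAssetExportSession,
            onProgress: @escaping (Float) -> Void,
            onDone: @escaping (AVAssetExportSession.Status) -> Void) {
    let timer = Timer.scheduledTimer(withTimeInterval: 0.1, repeats: true) { _ in
        onProgress(session.progress) // 0.0…1.0
    }
    session.exportAsynchronously {
        timer.invalidate()
        DispatchQueue.main.async { onDone(session.status) }
    }
}
```

Schedule the Timer on a thread with a running run loop (typically the main thread), otherwise it never fires.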
Output format: H.264 in MP4 — maximum compatibility. H.265 (HEVC) — smaller size, but not all servers accept it without re-encoding.
| iOS Preset | Resolution | Bitrate (approx.) |
|---|---|---|
| AVAssetExportPreset640x480 | 640×480 | ~1.5 Mbps |
| AVAssetExportPreset1280x720 | 1280×720 | ~5 Mbps |
| AVAssetExportPreset1920x1080 | 1920×1080 | ~10 Mbps |
| Custom AVVideoSettings | any | controllable |
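The "Custom AVVideoSettings" row refers to driving AVAssetWriter directly with an output-settings dictionary; a sketch with illustrative values:

```swift
import AVFoundation

// Sketch: custom output settings for AVAssetWriter when the fixed presets
// don't fit. Resolution and bitrate values here are illustrative.
let videoSettings: [String: Any] = [
    AVVideoCodecKey: AVVideoCodecType.h264,
    AVVideoWidthKey: 1280,
    AVVideoHeightKey: 720,
    AVVideoCompressionPropertiesKey: [
        AVVideoAverageBitRateKey: 3_000_000, // ~3 Mbps
        AVVideoProfileLevelKey: AVVideoProfileLevelH264HighAutoLevel
    ]
]
let writerInput = AVAssetWriterInput(mediaType: .video,
                                     outputSettings: videoSettings)
```

The trade-off: full control over bitrate and profile, but you also take on the read-encode-write pipeline (AVAssetReader → AVAssetWriter) that the export presets handle for you.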
Flutter: video_editor
video_editor (pub.dev) provides UI components (timeline, cropper) on top of ffmpeg_kit_flutter for the actual operations. Pro: cross-platform Flutter code for the UI. Con: FFmpeg processes video on the CPU, noticeably slower than native AVFoundation/MediaCodec solutions. For an MVP and simple content it works; for a production editor with frequent use, prefer a native implementation.
Development Timeline
Time-based trimming + export — 2–3 days. Full-featured editor with timeline, clip concatenation and text overlays — 5–8 days.