Live Stream Recording from Mobile App
Recording a stream "on the fly" means encoding one video stream to two destinations simultaneously: the server (RTMP/SRT) and a local file (MP4/MOV). This is not just "save what is being transmitted" — streaming over RTMP and recording to a file have different GOP-structure, bitrate, and keyframe requirements.
The Stream-Splitting Problem
The naive approach on iOS is to run a second AVAssetWriter in parallel with the broadcast encoder. It doesn't work: a VideoToolbox session (VTCompressionSession) cannot serve two consumers at once, and attempting it fails with -12401 kVTVideoEncoderNotAvailableNowErr.
The correct approach: one VTCompressionSession → encoded CMSampleBuffer → fan-out to two writers. When an encoded buffer arrives in the VTCompressionOutputCallback, write it to both the RTMP queue and the AVAssetWriter.
let status = VTCompressionSessionEncodeFrame(
    session,
    imageBuffer: imageBuffer,
    presentationTimeStamp: pts,
    duration: duration,
    frameProperties: nil,
    infoFlagsOut: nil
) { status, _, sampleBuffer in
    guard status == noErr, let buffer = sampleBuffer else { return }
    self.rtmpQueue.enqueue(buffer)       // → stream
    self.fileWriterInput.append(buffer)  // → local file
}
The two calls must not run synchronously on one thread: if the RTMP queue is blocked (network down), fileWriterInput.append must not wait on it. Give each consumer its own independent DispatchQueue.
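A minimal sketch of such a fan-out, assuming a hypothetical `sendToRTMP` closure for the network path and a pre-configured AVAssetWriterInput; each consumer gets its own serial queue so a stalled socket never delays the file writer:

```swift
import AVFoundation

final class EncodedFrameFanout {
    // Independent serial queues: a blocked RTMP path cannot delay
    // the AVAssetWriter, and vice versa.
    private let rtmpQueue = DispatchQueue(label: "fanout.rtmp")
    private let fileQueue = DispatchQueue(label: "fanout.file")

    private let writerInput: AVAssetWriterInput
    private let sendToRTMP: (CMSampleBuffer) -> Void  // hypothetical network sender

    init(writerInput: AVAssetWriterInput, sendToRTMP: @escaping (CMSampleBuffer) -> Void) {
        self.writerInput = writerInput
        self.sendToRTMP = sendToRTMP
    }

    // Called from the VTCompressionSession output handler.
    func handleEncodedFrame(_ buffer: CMSampleBuffer) {
        rtmpQueue.async { self.sendToRTMP(buffer) }          // → stream
        fileQueue.async {
            if self.writerInput.isReadyForMoreMediaData {
                self.writerInput.append(buffer)              // → local file
            }
            // else: the writer is momentarily busy — drop or buffer the frame
        }
    }
}
```

Serial (not concurrent) queues also preserve frame order per consumer, which AVAssetWriter requires.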
Audio-Video Synchronization in Recording
A common problem: audio in the MP4 file drifts relative to video. The reason is that AVAudioEngine and AVCaptureVideoDataOutput run on different timelines: CMSampleBuffer timestamps from the camera are on the host-time clock, while audio buffers carry AVAudioTime with hostTime.
Solution: normalize all timestamps relative to CACurrentMediaTime() at recording start and use it as the base clock. For audio, use AVAudioSourceNode with an explicit AVAudioTime; for video, CMSampleBufferGetPresentationTimeStamp minus the startup offset.
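A sketch of that normalization, assuming both clocks are host-time based (CACurrentMediaTime() and AVAudioTime.hostTime both derive from mach absolute time); function names here are illustrative, not from any framework:

```swift
import AVFoundation
import QuartzCore

// Captured once when recording starts; everything is rebased onto it.
let recordingStartSeconds = CACurrentMediaTime()

// Video: rebase the camera buffer's PTS onto the recording clock.
func normalizedVideoPTS(for sampleBuffer: CMSampleBuffer) -> CMTime {
    let pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
    let start = CMTime(seconds: recordingStartSeconds,
                       preferredTimescale: pts.timescale)
    return CMTimeSubtract(pts, start)
}

// Audio: convert an AVAudioTime host timestamp to seconds on the same clock.
func normalizedAudioSeconds(for time: AVAudioTime) -> Double {
    AVAudioTime.seconds(forHostTime: time.hostTime) - recordingStartSeconds
}
```

With both streams rebased to one zero point, the residual audio/video delta can be measured and corrected directly.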
The audio/video delta in the file should not exceed 40 ms — the perceptual threshold for desynchronization.
Alternative: ReplayKit
If the stream goes through ReplayKit (RPBroadcastSampleHandler), recording is organized differently: the handler receives RPSampleBufferType.video and RPSampleBufferType.audioApp, so both can be written to an AVAssetWriter directly, without running your own encoder.
The limitation: ReplayKit adds 2–5 seconds of capture latency. For a screen stream that is acceptable; for a camera stream it is not.
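A sketch of the ReplayKit path; writer setup (output URL, H.264/AAC settings, startSession) is assumed to happen elsewhere:

```swift
import ReplayKit
import AVFoundation

final class SampleHandler: RPBroadcastSampleHandler {
    // Assumed configured in broadcastStarted(withSetupInfo:).
    var videoInput: AVAssetWriterInput?
    var audioInput: AVAssetWriterInput?

    override func processSampleBuffer(_ sampleBuffer: CMSampleBuffer,
                                      with sampleBufferType: RPSampleBufferType) {
        switch sampleBufferType {
        case .video:
            // ReplayKit delivers raw frames; AVAssetWriter encodes them itself.
            if videoInput?.isReadyForMoreMediaData == true {
                videoInput?.append(sampleBuffer)
            }
        case .audioApp:
            if audioInput?.isReadyForMoreMediaData == true {
                audioInput?.append(sampleBuffer)
            }
        default:
            break // .audioMic would be handled separately if mixed in
        }
    }
}
```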
Storage Management
Before starting a recording, check the available space:
let attrs = try FileManager.default.attributesOfFileSystem(forPath: NSHomeDirectory())
let freeSpace = (attrs[.systemFreeSize] as? Int64) ?? 0
let estimatedSize = Int64(bitrate / 8) * Int64(expectedDurationSeconds)  // bytes
guard freeSpace > estimatedSize * 2 else { /* warn the user and abort */ return }
The factor of 2 is headroom for AVAssetWriter temp files and the OS. At 4 Mbps, one hour of stream takes ~1.8 GB (4 Mbps ≈ 0.5 MB/s × 3600 s).
Segmented recording (a new file every 30 minutes) reduces the risk of data loss on a crash and simplifies later upload to the server.
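One way to sketch the rollover, assuming input configuration and the actual append are filled in elsewhere (elided here):

```swift
import AVFoundation

// Rolls over to a fresh MP4 every `segmentDuration`, so a crash loses
// at most one segment and finished segments can be uploaded immediately.
final class SegmentedRecorder {
    private let segmentDuration = CMTime(seconds: 30 * 60, preferredTimescale: 600)
    private var segmentStart = CMTime.invalid
    private var writer: AVAssetWriter?
    private var index = 0

    func append(_ buffer: CMSampleBuffer) throws {
        let pts = CMSampleBufferGetPresentationTimeStamp(buffer)
        if segmentStart == .invalid || CMTimeSubtract(pts, segmentStart) >= segmentDuration {
            try startNewSegment(at: pts)
        }
        // append `buffer` to the current writer's inputs here…
    }

    private func startNewSegment(at pts: CMTime) throws {
        writer?.finishWriting { }  // finalize the previous segment asynchronously
        index += 1
        let url = FileManager.default.temporaryDirectory
            .appendingPathComponent("segment_\(index).mp4")
        writer = try AVAssetWriter(outputURL: url, fileType: .mp4)
        segmentStart = pts
        // configure inputs, startWriting(), startSession(atSourceTime: pts)…
    }
}
```

On iOS 14+, AVAssetWriter can also produce fragmented-MP4 segments natively via preferredOutputSegmentInterval, which may replace manual rollover.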
Timeline
Basic parallel recording (iOS, single stream): 1–1.5 weeks. Full implementation with A/V sync, storage management, segmentation, and Android support: 3–4 weeks. Cost is estimated individually.