HLS Streaming from Mobile Device
HLS streaming from a mobile device is not the same as HLS playback. Here the phone is the source: it records video in segments, uploads them to a server, and keeps the m3u8 playlist up to date. This is a less common scenario than RTMP streaming, but it is needed for CDN integrations that accept origin push over HTTP.
How HLS Push Works from a Device
The classic scheme: phone → HTTP PUT/POST of TS segments plus .m3u8 updates → origin HTTP server → CDN → viewers. The alternative: phone → RTMP → media server → HLS for viewers (the HLS is then generated by the server, not the phone).
Direct HLS push from a device works against: Apple's mediastreamsegmenter (macOS only), Akamai Media Services ingest, or any WebDAV-enabled origin (e.g. a custom nginx with ngx_http_dav_module), optionally fronted by a CDN such as CloudFront.
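The "update .m3u8" step above boils down to regenerating a small sliding-window playlist after every segment upload and PUT-ting it to the origin. A minimal sketch of that playlist logic, platform-neutral; the function name, window size, and segment naming are illustrative, not from any library:

```python
def make_playlist(last_seq: int, window: int = 5, target: int = 2) -> str:
    """Build a live m3u8 advertising the `window` most recent segments.

    MEDIA-SEQUENCE must equal the sequence number of the first segment
    still listed, so players track the window as old segments drop off.
    """
    first = max(0, last_seq - window + 1)
    lines = [
        "#EXTM3U",
        "#EXT-X-VERSION:3",
        f"#EXT-X-TARGETDURATION:{target}",
        f"#EXT-X-MEDIA-SEQUENCE:{first}",
    ]
    for seq in range(first, last_seq + 1):
        lines.append(f"#EXTINF:{target:.3f},")
        lines.append(f"segment{seq}.ts")
    return "\n".join(lines) + "\n"

# After uploading segment7.ts, the device PUTs this text to the origin:
print(make_playlist(7))
```

No EXT-X-ENDLIST is written, which is what marks the playlist as live rather than VOD.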
iOS: ReplayKit + HLS Segmenter
iOS has no built-in HLS encoder for live streaming; AVAssetExportSession works only on finished files. For live HLS from the device you build the pipeline manually:
AVCaptureSession → AVCaptureVideoDataOutput → VideoToolbox (H.264 encoding) → accumulate NAL units into 2–4-second TS segments → URLSession.uploadTask to the server → update the m3u8 manifest.
Assembling MPEG-TS from NAL units means manual PES packetization plus PAT/PMT tables, and there is no ready-made Swift library for it. So in practice:
Option 1: FFmpegKit. Run FFmpegKit.executeAsync with HLS output:
-f avfoundation -i 0:0 -c:v h264_videotoolbox -b:v 2M -hls_time 2 -hls_list_size 5 -hls_flags delete_segments -method PUT http://origin/stream/index.m3u8
h264_videotoolbox is the hardware encoder; -method PUT uploads both the segments and the playlist to the server. This works with moderate CPU load.
Option 2: RTMP → server → HLS. The phone pushes RTMP via HaishinKit, and a media server (Nginx-RTMP, MediaMTX) converts it to HLS. More reliable, but it adds 1–2 seconds of latency.
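The tricky part of the manual pipeline ("accumulate NAL units into 2–4-second TS segments") is deciding where to cut: each segment must start with a keyframe (IDR) so players can decode it independently. A sketch of that cut decision, with illustrative names; a real segmenter would operate on encoder output rather than a prepared list:

```python
def cut_points(frames, target_s: float = 2.0):
    """frames: list of (pts_seconds, is_keyframe) tuples in decode order.

    Returns the frame indices where a new segment should begin: the first
    keyframe after at least `target_s` seconds of the current segment.
    """
    cuts = [0]
    seg_start = frames[0][0]
    for i, (pts, key) in enumerate(frames):
        if key and pts - seg_start >= target_s:
            cuts.append(i)
            seg_start = pts
    return cuts

# 10 seconds at 30 fps, keyframe once per second:
frames = [(i / 30, i % 30 == 0) for i in range(300)]
print(cut_points(frames))  # → [0, 60, 120, 180, 240]
```

With a keyframe interval longer than the target, segments simply come out longer; that is why the encoder's keyframe interval should be set to (or below) the HLS segment duration.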
Android: MediaMuxer + HLS
Android's MediaMuxer supports the output format MUXER_OUTPUT_MPEG_4, which is plain MP4, not TS. HLS needs either TS (MPEG-2 Transport Stream) segments or fMP4 (fragmented MP4, supported since HLS protocol version 7).
Getting fMP4 segments out of MediaMuxer is not really supported: hacks like writing to a ByteArrayOutputStream and forcing fragmented output via MediaFormat.KEY_IS_ADTS or vendor-specific paths are non-standard and unreliable.
The reliable way: FFmpegKit, the same approach as on iOS, with h264_mediacodec in place of h264_videotoolbox:
-f android_camera -i 0 -c:v h264_mediacodec -b:v 2M -hls_time 2 -hls_list_size 5 -method PUT http://origin/stream/index.m3u8
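The iOS and Android commands differ only in the input device and the encoder name; every HLS-related flag is the same. A sketch that makes the shared part explicit (the helper name is illustrative, and delete_segments is applied on both platforms here, unlike the Android one-liner above):

```python
def hls_push_cmd(input_args: str, encoder: str, origin: str) -> str:
    """Assemble an FFmpeg command string for HLS push to an origin server."""
    return (
        f"{input_args} -c:v {encoder} -b:v 2M "
        "-hls_time 2 -hls_list_size 5 -hls_flags delete_segments "
        f"-method PUT {origin}/stream/index.m3u8"
    )

ios = hls_push_cmd("-f avfoundation -i 0:0", "h264_videotoolbox", "http://origin")
android = hls_push_cmd("-f android_camera -i 0", "h264_mediacodec", "http://origin")
print(ios)
print(android)
```

Keeping the command in one place like this helps when the two platforms need to stay in sync on bitrate, segment length, and window size.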
Low-Latency HLS (LL-HLS)
Standard HLS carries 6–30 seconds of latency. LL-HLS (Apple's Low-Latency HLS, introduced at WWDC 2019) reduces it to 2–4 seconds, but both server and client must support the EXT-X-PART playlist directives.
Generating LL-HLS parts (0.2–0.5 s part duration) directly on the phone is extremely difficult without a specialized encoder. More practical: the device pushes RTMP with a small buffer, and a media server (Nimble Streamer, Wowza with LL-HLS support) generates LL-HLS for viewers.
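What EXT-X-PART changes in the playlist: each full segment is additionally advertised as several sub-second parts that players fetch before the segment is complete, which means the source would have to publish a new part every 0.2–0.5 s in real time. A sketch of the playlist lines for one segment (file naming and function name are illustrative):

```python
def parts_for_segment(seq: int, seg_dur: float = 2.0, part_dur: float = 0.5):
    """Playlist lines advertising one segment as LL-HLS parts.

    Parts appear first so a player can start on them; the ordinary
    EXTINF/segment pair follows once the segment is complete.
    """
    lines = []
    for p in range(int(seg_dur / part_dur)):
        lines.append(
            f'#EXT-X-PART:DURATION={part_dur},URI="seg{seq}.part{p}.mp4"'
        )
    lines.append(f"#EXTINF:{seg_dur:.3f},")
    lines.append(f"seg{seq}.mp4")
    return lines

for line in parts_for_segment(3):
    print(line)
```

Four PUT requests per 2-second segment instead of one is exactly the publishing cadence a mobile origin-push client struggles to sustain, which is why offloading LL-HLS generation to a media server is the practical route.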
Use Case: Recording and Simultaneous Streaming
A frequent requirement: record video to the device and stream it at the same time. AVCaptureSession with two outputs handles this: AVCaptureMovieFileOutput for the local recording and AVCaptureVideoDataOutput for the streaming capture. Both run in parallel; a modern chip (A15 and newer) handles both streams without issues.
Timeline
HLS streaming from mobile via FFmpegKit with upload to an origin server: 2–3 days. With a custom TS segmenter and LL-HLS: 1–2 weeks.