Integrating Live Streaming into a Mobile Application
Live streaming is one of the most complex media tasks in mobile development: you simultaneously capture video and audio from the camera, encode in real time, transmit over the network, receive on the viewer's side, and play back with minimal latency. Each of these steps is a separate engineering challenge.
System Architecture
A mobile application is just one component. A complete live-streaming system requires:
- Sender (streamer): capture + encoding + transmission of the stream to a media server
- Media server: receives the RTMP ingest, transcodes, and distributes to viewers via HLS/RTMP/WebRTC
- Viewer: plays back the HLS/RTMP/WebRTC stream
- Signaling (for WebRTC): exchange of SDP offers/answers and ICE candidates
Popular media servers: Wowza Streaming Engine, nginx-rtmp-module, Ant Media Server (free for basic scenarios), AWS IVS (Amazon Interactive Video Service — managed service).
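For the self-hosted route, a hedged nginx-rtmp-module configuration that accepts RTMP ingest and repackages it as HLS might look like this (the application name and paths are assumptions):

```nginx
rtmp {
    server {
        listen 1935;                  # standard RTMP port
        application live {
            live on;                  # accept live streams
            hls on;                   # repackage the stream as HLS
            hls_path /var/www/hls;    # where segments and playlists are written
            hls_fragment 4s;          # segment duration
            hls_playlist_length 30s;  # sliding-window size of the playlist
        }
    }
}
```

Viewers then fetch `http://your-server/hls/stream-key.m3u8` from any HTTP server pointed at `hls_path`.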
Camera Streaming: RTMP
iOS. There is no native RTMP client. Use HaishinKit — a mature library with RTMP, SRT, and HLS output support:
```swift
import HaishinKit
import AVFoundation

let rtmpConnection = RTMPConnection()
let rtmpStream = RTMPStream(connection: rtmpConnection)

// Camera and microphone capture
rtmpStream.attachCamera(AVCaptureDevice.default(for: .video))
rtmpStream.attachAudio(AVCaptureDevice.default(for: .audio))

// Encoding parameters
rtmpStream.frameRate = 30
rtmpStream.videoSettings = VideoCodecSettings(
    videoSize: CGSize(width: 1280, height: 720),
    bitRate: 2_000_000,            // 2 Mbps
    maxKeyFrameIntervalDuration: 2 // keyframe (GOP) every 2 seconds
)
rtmpStream.audioSettings = AudioCodecSettings(
    bitRate: 128_000 // 128 kbps AAC
)

// On-screen preview
let hkView = MTHKView(frame: previewView.bounds)
rtmpStream.addOutput(hkView)
previewView.addSubview(hkView)

// Connect and publish
rtmpConnection.connect("rtmp://your-server/live")
rtmpStream.publish("stream-key")
```
Android. There is no native RTMP client either. Larix Broadcaster SDK is a commercial, professional option. Free alternatives: rtmp-rtsp-stream-client-java (github.com/pedroSG94) or StreamPack (github.com/thibaultbee). Under the hood: CameraX for capture + MediaCodec for encoding + a custom RTMP client.
```kotlin
// StreamPack example (API per StreamPack 2.x; check the library docs for your version)
val streamer = CameraStreamer(context, enableAudio = true)
streamer.configure(
    AudioConfig(startBitrate = 128_000, sampleRate = 44100, channelConfig = AudioFormat.CHANNEL_IN_STEREO),
    VideoConfig(startBitrate = 2_000_000, resolution = Size(1280, 720), fps = 30)
)
streamer.startPreview(surface)
// connect() and startStream() are suspend functions: call them from a coroutine
streamer.connect("rtmp://your-server/live/stream-key")
streamer.startStream()
```
Alternatives to RTMP: SRT and WHIP
SRT (Secure Reliable Transport) — an alternative to RTMP designed for unstable networks. HaishinKit supports SRT natively. Latency is ~1–3 seconds even with packet loss of up to 30%.
WHIP (WebRTC-HTTP Ingestion Protocol) — an IETF-standardized protocol for ingest via WebRTC. Latency < 500 ms. Supported by Cloudflare Stream, Ant Media 2.x, mediasoup. On mobile — WebRTC via the WebRTC.org (libwebrtc) SDK.
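WHIP ingest itself is just an HTTP POST of an SDP offer; the server replies with the SDP answer. A hedged Kotlin sketch with OkHttp (the endpoint URL and bearer token are placeholders; the offer comes from libwebrtc's `createOffer` callback):

```kotlin
import okhttp3.MediaType.Companion.toMediaType
import okhttp3.OkHttpClient
import okhttp3.Request
import okhttp3.RequestBody.Companion.toRequestBody

// Sketch: POST a local SDP offer to a WHIP endpoint and return the SDP answer.
fun whipPublish(sdpOffer: String): String? {
    val client = OkHttpClient()
    val request = Request.Builder()
        .url("https://your-server/whip/endpoint")        // placeholder endpoint
        .addHeader("Authorization", "Bearer stream-key") // placeholder token
        .post(sdpOffer.toRequestBody("application/sdp".toMediaType()))
        .build()
    client.newCall(request).execute().use { response ->
        // On success the server returns 201 Created with an application/sdp body;
        // the Location header identifies the session for later DELETE (teardown).
        return if (response.code == 201) response.body?.string() else null
    }
}
```

The returned answer is then passed to the peer connection's `setRemoteDescription`.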
Viewing the Live Stream
HLS (the standard for mass viewing). Latency: 5–30 seconds. iOS: native AVPlayer. Android: ExoPlayer (Media3). The media server splits the RTMP stream into HLS segments of 2–6 seconds.
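On Android, playing the resulting playlist takes only a few lines with Media3 ExoPlayer (the manifest URL is a placeholder; the `media3-exoplayer-hls` artifact must be on the classpath):

```kotlin
import androidx.media3.common.MediaItem
import androidx.media3.exoplayer.ExoPlayer

// Sketch: play an HLS live manifest with Media3 ExoPlayer.
val player = ExoPlayer.Builder(context).build()
playerView.player = player // androidx.media3.ui.PlayerView from the layout

// ExoPlayer picks the HLS media source automatically from the .m3u8 URI
player.setMediaItem(MediaItem.fromUri("https://your-server/hls/stream-key.m3u8"))
player.prepare()
player.playWhenReady = true
```

Remember to call `player.release()` when the screen is destroyed.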
Low-Latency HLS (LL-HLS). Latency 1–3 seconds. Apple TV+ uses this approach. AVPlayer supports LL-HLS natively from iOS 14. Server must support EXT-X-PART directives.
WebRTC for ultra-low latency (< 500 ms). The WebRTC.org SDK runs on the clients, plus a signaling server (Socket.io or custom) for SDP exchange. Server options: Ant Media, Janus, mediasoup, Cloudflare Calls.
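For WebRTC playback, the only nonstandard piece is signaling. A hedged Kotlin sketch of the client side using OkHttp's WebSocket and a made-up JSON message format (the actual SDP/ICE payloads come from libwebrtc's PeerConnection callbacks):

```kotlin
import okhttp3.OkHttpClient
import okhttp3.Request
import okhttp3.WebSocket
import okhttp3.WebSocketListener
import org.json.JSONObject

// Sketch: exchange SDP/ICE over a WebSocket signaling channel.
val client = OkHttpClient()
val request = Request.Builder().url("wss://your-server/signaling").build()

val socket = client.newWebSocket(request, object : WebSocketListener() {
    override fun onMessage(webSocket: WebSocket, text: String) {
        val msg = JSONObject(text)
        when (msg.getString("type")) {
            "answer" -> { /* peerConnection.setRemoteDescription(...) */ }
            "candidate" -> { /* peerConnection.addIceCandidate(...) */ }
        }
    }
})

// Called with the SDP produced by libwebrtc's createOffer callback
fun sendOffer(sdp: String) {
    socket.send(JSONObject().put("type", "offer").put("sdp", sdp).toString())
}
```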
| Technology | Latency | Scalability | Complexity |
|---|---|---|---|
| RTMP → HLS | 15–30 s | high | low |
| RTMP → LL-HLS | 2–5 s | high | medium |
| SRT → HLS | 10–20 s | high | medium |
| WebRTC | < 500 ms | medium | high |
| RTMP → RTMP | 1–5 s | low | low |
Real-time Chat
WebSocket or Firebase Realtime Database / Firestore — both work for stream chat. Firestore is simpler to integrate and is sufficient for small audiences (up to roughly 5,000 concurrent viewers). For larger audiences, run a custom WebSocket server (Node.js + Socket.io, Go + Gorilla WebSocket).
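A hedged Firestore sketch (the collection names are assumptions): each stream gets a messages subcollection; clients write with a server-side timestamp and subscribe to the tail of the collection:

```kotlin
import com.google.firebase.firestore.FieldValue
import com.google.firebase.firestore.Query
import com.google.firebase.firestore.ktx.firestore
import com.google.firebase.ktx.Firebase

val messages = Firebase.firestore
    .collection("streams").document(streamId)
    .collection("messages")

// Send a message; the timestamp is assigned by the server, not the client
messages.add(mapOf(
    "userId" to userId,
    "text" to text,
    "sentAt" to FieldValue.serverTimestamp()
))

// Subscribe to the latest 50 messages in real time
messages.orderBy("sentAt", Query.Direction.ASCENDING)
    .limitToLast(50)
    .addSnapshotListener { snapshot, error ->
        if (error != null) return@addSnapshotListener
        snapshot?.documentChanges?.forEach { change ->
            // render change.document.data in the chat UI
        }
    }
```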
Anti-spam and moderation are a separate task: muted users, banned users, and rate limiting at the backend level.
Assessment and Timeline
The scope depends on the architecture, which is determined during requirements analysis. Rough guidelines:
- MVP: RTMP streaming from one platform (iOS or Android) + HLS viewing + basic chat — 2–3 weeks
- Full system: both platforms + LL-HLS/WebRTC + chat moderation + stream recording + VOD — 2–3 months
Cost is calculated individually after analyzing latency, audience, and server infrastructure requirements.