Implementing Screen Sharing in a Mobile Application
Screen sharing on mobile platforms is a fundamentally different task than on desktop. iOS and Android do not give an app direct access to the screen buffer; you must go through dedicated system mechanisms, each with its own limitations.
iOS: ReplayKit and Broadcast Upload Extension
On iOS, screen capture works only via ReplayKit. You cannot read screen content directly from the main app process; this is a sandbox limitation.
For real-time streaming, use RPSystemBroadcastPickerView (iOS 12+), which shows the system picker for starting a broadcast. Streaming runs in a Broadcast Upload Extension: a separate project target that executes in its own process (com.apple.broadcast-services-upload). The extension and the main app do not share memory, so data exchange is possible only via an App Group (shared UserDefaults, a shared FileManager container, or Darwin notifications for signaling).
Extension structure:
class SampleHandler: RPBroadcastSampleHandler {
    override func processSampleBuffer(_ sampleBuffer: CMSampleBuffer,
                                      with sampleBufferType: RPSampleBufferType) {
        switch sampleBufferType {
        case .video:
            // Pass the CMSampleBuffer to WebRTC/Agora/LiveKit
            videoSource?.capturer(capturer, didCapture: toRTCVideoFrame(sampleBuffer))
        case .audioApp:
            break // system audio
        case .audioMic:
            break // microphone (available only with explicit Info.plist permission)
        @unknown default:
            break
        }
    }
}
The extension has a 50 MB memory limit (unlike the main app). If you link a heavy SDK, you exceed the limit and get an NSExtensionRequestExpiredException without a clear error message. Agora and LiveKit provide lightweight SDK versions specifically for the Broadcast Extension.
Communication from the extension to the main app goes via CFNotificationCenter (Darwin notifications): the extension posts a notification, and the main app establishes the WebRTC peer connection and starts relaying frames.
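A minimal Swift sketch of this signaling channel, assuming a hypothetical notification name and App Group ID (Darwin notifications carry no payload, so any actual data must go through the shared App Group container):

```swift
import Foundation

// Hypothetical names; both targets must use the same strings.
let broadcastStartedName = "com.example.app.broadcastStarted" as CFString
let sharedDefaults = UserDefaults(suiteName: "group.com.example.app")

// In the Broadcast Upload Extension: post a Darwin notification.
func notifyBroadcastStarted() {
    sharedDefaults?.set(true, forKey: "isBroadcasting")
    CFNotificationCenterPostNotification(
        CFNotificationCenterGetDarwinNotifyCenter(),
        CFNotificationName(broadcastStartedName),
        nil, nil, true
    )
}

// In the main app: observe the notification.
func observeBroadcastStarted() {
    CFNotificationCenterAddObserver(
        CFNotificationCenterGetDarwinNotifyCenter(),
        nil,
        { _, _, _, _, _ in
            // Establish the WebRTC peer connection here.
        },
        broadcastStartedName,
        nil,
        .deliverImmediately
    )
}
```

The callback runs without context (Darwin notifications cannot pass userInfo between processes), so the flag in shared UserDefaults is the usual way to pass state.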
Resolution and frame rate are limited by ReplayKit: up to 1080p and up to 60 FPS. In practice, 720p at 15 fps is sufficient for screen sharing: mobile screens are small, and the lower frame rate doesn't hurt text legibility while reducing bitrate.
Android: MediaProjection API
On Android, screen capture goes through the MediaProjection API (Android 5.0+). The user must explicitly grant recording permission:
val mediaProjectionManager = getSystemService(MEDIA_PROJECTION_SERVICE) as MediaProjectionManager
startActivityForResult(
    mediaProjectionManager.createScreenCaptureIntent(),
    REQUEST_MEDIA_PROJECTION
)
In onActivityResult you receive an Intent carrying the permission grant; from it you create a MediaProjection. MediaProjection.createVirtualDisplay() then creates a VirtualDisplay that renders screen content into the Surface you supply.
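A sketch of that continuation, assuming the REQUEST_MEDIA_PROJECTION constant from above and an already-created `surface` (e.g. from an ImageReader or a MediaCodec encoder):

```kotlin
override fun onActivityResult(requestCode: Int, resultCode: Int, data: Intent?) {
    if (requestCode == REQUEST_MEDIA_PROJECTION && resultCode == RESULT_OK && data != null) {
        val projection = mediaProjectionManager.getMediaProjection(resultCode, data)
        val metrics = resources.displayMetrics
        // The system renders screen content into the supplied Surface.
        val virtualDisplay = projection.createVirtualDisplay(
            "ScreenShare",
            metrics.widthPixels, metrics.heightPixels, metrics.densityDpi,
            DisplayManager.VIRTUAL_DISPLAY_FLAG_AUTO_MIRROR,
            surface,    // e.g. imageReader.surface or the encoder's input surface
            null, null  // no VirtualDisplay.Callback, default handler
        )
    }
}
```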
For WebRTC transmission, use ScreenCapturerAndroid (from org.webrtc): it wraps MediaProjection and feeds frames into a VideoSource.
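A sketch of the wiring, assuming an existing PeerConnectionFactory (`factory`), an EglBase instance (`eglBase`), and the permission Intent from onActivityResult (`permissionData`):

```kotlin
val capturer = ScreenCapturerAndroid(permissionData, object : MediaProjection.Callback() {
    override fun onStop() { /* user revoked sharing */ }
})

// isScreencast = true tells WebRTC to prefer sharpness over smoothness.
val videoSource = factory.createVideoSource(/* isScreencast = */ true)
val helper = SurfaceTextureHelper.create("ScreenCaptureThread", eglBase.eglBaseContext)

capturer.initialize(helper, applicationContext, videoSource.capturerObserver)
capturer.startCapture(1280, 720, 15)  // 720p/15fps, as recommended above

val videoTrack = factory.createVideoTrack("screen", videoSource)
```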
Starting with Android 10, screen capture must run in a ForegroundService declared with android:foregroundServiceType="mediaProjection" in the manifest. Starting with Android 14, you must additionally pass ServiceInfo.FOREGROUND_SERVICE_TYPE_MEDIA_PROJECTION explicitly when calling startForeground(). Without this, starting a foreground service for mediaProjection throws a SecurityException on Android 14+.
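A sketch of the service side; the service class, notification helper, and ID are assumptions:

```kotlin
class ScreenShareService : Service() {
    private val notificationId = 1  // hypothetical

    override fun onStartCommand(intent: Intent?, flags: Int, startId: Int): Int {
        val notification = buildScreenShareNotification()  // hypothetical helper
        if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.UPSIDE_DOWN_CAKE) {
            // Android 14+: the type must also be passed explicitly here,
            // in addition to the manifest declaration.
            startForeground(
                notificationId, notification,
                ServiceInfo.FOREGROUND_SERVICE_TYPE_MEDIA_PROJECTION
            )
        } else {
            startForeground(notificationId, notification)
        }
        return START_NOT_STICKY
    }

    override fun onBind(intent: Intent?): IBinder? = null
}
```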
When the session is stopped, MediaProjection.Callback.onStop() is invoked; handle it and notify the user, don't just log it. The user can end screen sharing from the notification shade, and the app must transition cleanly to a no-sharing state.
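A sketch of handling this, assuming a `virtualDisplay` reference and a hypothetical UI hook:

```kotlin
val projectionCallback = object : MediaProjection.Callback() {
    override fun onStop() {
        // Release capture resources and update the UI, not just log.
        virtualDisplay?.release()
        runOnUiThread { showSharingStoppedState() }  // hypothetical UI hook
    }
}
// Register before createVirtualDisplay(); mandatory on Android 14+.
mediaProjection.registerCallback(projectionCallback, Handler(Looper.getMainLooper()))
```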
Frame Transmission in Real Time
Captured frames arrive as CMSampleBuffer on iOS or as Image objects via ImageReader on Android. For network transmission via WebRTC, convert them to video frames backed by an I420 buffer (RTCI420Buffer on iOS, JavaI420Buffer on Android). BGRA-to-YUV conversion is an expensive operation; do it in native code (C++, e.g. libyuv) or via a hardware converter.
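A sketch of the Android side of this hand-off; `convertToI420` stands in for your native conversion routine and is an assumption:

```kotlin
// Deliver one converted frame to WebRTC via the source's CapturerObserver.
fun deliverFrame(width: Int, height: Int, observer: CapturerObserver) {
    val buffer = JavaI420Buffer.allocate(width, height)
    // convertToI420(srcBgra, buffer.dataY, buffer.dataU, buffer.dataV, ...)  // hypothetical native call
    val frame = VideoFrame(buffer, /* rotation = */ 0, System.nanoTime())
    observer.onFrameCaptured(frame)
    frame.release()  // VideoFrame is ref-counted; release after handing off
}
```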
For Agora, AgoraRtcKit.startScreenCapture() takes a configuration with contentHint set to .text, which optimizes the codec for text content (little motion, high sharpness).
Timeline
iOS (ReplayKit + Broadcast Extension + WebRTC): 3–5 days with signaling already in place. Android (MediaProjection + WebRTC): 2–3 days. Both platforms with proper lifecycle handling, background mode, and notification UX: 1–1.5 weeks. Cost is calculated individually.