AI sleep quality analysis from sensor data in mobile app

TRUETECH develops, supports, and maintains iOS, Android, and PWA mobile applications. We have extensive experience and expertise in publishing mobile applications in popular markets such as Google Play, the App Store, Amazon, AppGallery, and others.
Development and support of all types of mobile applications:
Information and entertainment mobile applications
News apps, games, reference guides, online catalogs, weather apps, fitness and health apps, travel apps, educational apps, social networks and messengers, quizzes, blogs and podcasts, forums, aggregators
E-commerce mobile applications
Online stores, B2B apps, marketplaces, online exchanges, cashback services, dropshipping platforms, loyalty programs, food and goods delivery, payment systems.
Business process management mobile applications
CRM systems, ERP systems, project management, sales team tools, financial management, production management, logistics and delivery management, HR management, data monitoring systems
Electronic services mobile applications
Classified ads platforms, online schools, online cinemas, electronic service platforms, cashback platforms, video hosting, thematic portals, online booking and scheduling platforms, online trading platforms

These are just some of the types of mobile applications we work with; each may have its own features and functionality tailored to the client's specific needs and goals.


AI-Powered Sleep Quality Analysis from Sensor Data

During sleep, a wrist accelerometer records characteristic patterns for each phase: near-total stillness with periodic micromovements in deep sleep, rare movements with a rising pulse in REM, and clear activity during wakefulness. The model's task is to label 8 hours of this data correctly.

Raw Data Sources

On iOS, sleep data is available via HealthKit as HKCategoryType(.sleepAnalysis); Apple Watch and third-party trackers (Oura, Fitbit, Garmin) record their results there. But HealthKit does not provide raw accelerometer and gyroscope data retrospectively, so training your own ML classifier requires a background app that captures data with CMMotionManager and writes it to disk.

import CoreMotion

class NightMotionRecorder {
    private let motionManager = CMMotionManager()
    private let queue: OperationQueue = {
        let q = OperationQueue()
        q.maxConcurrentOperationCount = 1  // serial: buffer appends stay ordered and race-free
        return q
    }()
    private var dataBuffer: [(timestamp: Date, x: Double, y: Double, z: Double)] = []

    func startNightRecording() {
        guard motionManager.isAccelerometerAvailable else { return }
        motionManager.accelerometerUpdateInterval = 1.0 / 25.0  // 25 Hz is sufficient for sleep

        motionManager.startAccelerometerUpdates(to: queue) { [weak self] data, _ in
            guard let data = data else { return }
            self?.dataBuffer.append((
                timestamp: Date(),
                x: data.acceleration.x,
                y: data.acceleration.y,
                z: data.acceleration.z
            ))
            // Periodically flush dataBuffer to disk so data survives app termination
        }
    }
}

25 Hz is a balance between accuracy and battery drain: sleep tracking does not need a 100 Hz accelerometer. Background operation for CMMotionManager requires UIBackgroundModes: motion in Info.plist, but iOS aggressively terminates background tasks. A more reliable approach is a BGProcessingTask that processes the night's partial recordings.

SpO2 from HealthKit

Pulse oximetry is available via HKQuantityType(.oxygenSaturation). Apple Watch Series 6 and later takes background SpO2 readings every 1–2 hours during the night. A drop below 90% is a sign of sleep apnea. But Apple does not record continuous nighttime SpO2 (for battery reasons), so the intermittent readings require careful interpolation.
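A minimal sketch of handling those intermittent readings: resample onto a regular grid and flag desaturations (function names and the 5-minute grid are our illustrative assumptions, not an Apple API):

```python
import numpy as np

def resample_spo2(timestamps, values, step_s=300):
    """Linearly interpolate sparse SpO2 readings onto a regular grid.

    timestamps: seconds since sleep onset; values: SpO2 in percent.
    """
    t = np.asarray(timestamps, dtype=float)
    v = np.asarray(values, dtype=float)
    grid = np.arange(t[0], t[-1] + 1, step_s)
    return grid, np.interp(grid, t, v)

def desaturation_flags(spo2, threshold=90.0):
    """Mark grid points where interpolated SpO2 dips below the threshold."""
    return spo2 < threshold
```

Keep in mind that linear interpolation across a 1–2 hour gap can hide short apnea events, so wide gaps should be surfaced as "no data" rather than treated as measurements.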

Sleep Stage Classification Algorithm

The standard task has four classes: Wake, Light NREM, Deep NREM (N3), and REM. The clinically valid reference is polysomnography (PSG) with EEG. Accelerometer plus heart rate is enough for Wake/Sleep separation at roughly 85% accuracy; full four-stage classification typically reaches 60–75% agreement with PSG.

Feature Engineering from Accelerometer

From the raw 25 Hz signal, compute per 30-second epoch:

  • Activity count: sum of absolute changes in the acceleration vector (Cole-Kripke algorithm)
  • ZCR (Zero Crossing Rate): zero-crossing frequency, correlates with fine motor activity
  • ENMO (Euclidean Norm Minus One): actigraphy standard, sqrt(x²+y²+z²) − 1g, removes the gravity component
  • Z-axis angle: wrist tilt, characteristic of different sleep positions
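The epoching step behind these features can be sketched as follows (assuming the night's samples arrive as an (N, 3) NumPy array):

```python
import numpy as np

def segment_epochs(samples, fs=25, epoch_s=30):
    """Split a continuous (N, 3) accelerometer stream into fixed epochs.

    Trailing samples that do not fill a whole epoch are dropped.
    Returns an array of shape (n_epochs, fs * epoch_s, 3).
    """
    epoch_len = fs * epoch_s                      # 750 samples at 25 Hz
    n_epochs = samples.shape[0] // epoch_len
    trimmed = samples[: n_epochs * epoch_len]
    return trimmed.reshape(n_epochs, epoch_len, 3)
```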

From HR (if available) add:

  • Resting HR vs current HR (delta)
  • HRV (RMSSD from RR-intervals): a vagal-tone marker, typically higher in deep NREM than in REM
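RMSSD itself is straightforward to compute from a list of RR-intervals (a sketch; the helper name is ours):

```python
import numpy as np

def rmssd(rr_intervals_ms):
    """RMSSD: root mean square of successive RR-interval differences (ms)."""
    rr = np.asarray(rr_intervals_ms, dtype=float)
    diff = np.diff(rr)                 # successive differences
    return float(np.sqrt(np.mean(diff ** 2)))
```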

Model

A Random Forest on these features gives a reasonable baseline. For temporal context, add an LSTM over the RF features: the RF produces a feature vector for each 30-second epoch, and the LSTM models the sequence of epochs. This pattern comes from Stanford Sleep Lab research.

# Feature extraction for one epoch (30 s, 750 samples at 25 Hz)
import numpy as np

def extract_epoch_features(epoch_data):
    x, y, z = epoch_data[:, 0], epoch_data[:, 1], epoch_data[:, 2]
    # ENMO: vector magnitude minus 1 g, clipped at zero to remove gravity
    enmo = np.maximum(np.sqrt(x**2 + y**2 + z**2) - 1, 0)
    # Wrist tilt relative to the horizontal plane, in degrees
    angle_z = np.arctan(z / np.sqrt(x**2 + y**2 + 1e-6)) * 180 / np.pi

    return {
        'enmo_mean': np.mean(enmo),
        'enmo_std': np.std(enmo),
        'enmo_max': np.max(enmo),
        'angle_z_mean': np.mean(angle_z),
        'angle_z_std': np.std(angle_z),
        'activity_count': np.sum(np.abs(np.diff(enmo)))
    }
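Training the Random Forest baseline on these per-epoch feature dicts might look like this (a sketch assuming scikit-learn and reference labels from a PSG or validated wearable; hyperparameters are illustrative):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

STAGES = ['wake', 'light', 'deep', 'rem']

def train_baseline(feature_rows, labels):
    """Fit the per-epoch Random Forest baseline.

    feature_rows: list of dicts, e.g. from extract_epoch_features()
    labels: per-epoch stage indices from a reference device
    """
    keys = sorted(feature_rows[0])                 # fixed feature order
    X = np.array([[row[k] for k in keys] for row in feature_rows])
    clf = RandomForestClassifier(n_estimators=200, class_weight='balanced')
    clf.fit(X, labels)
    return clf, keys
```

Class balancing matters here because a full night is dominated by light sleep, so an unweighted model tends to under-predict Wake and REM.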

Model Conversion and Deployment

For iOS: convert the sklearn Random Forest with coremltools.converters.sklearn.convert() and save it as an .mlmodel. The LSTM goes into a separate Core ML neural network; combine the two in a Core ML pipeline. For Android: ship both models via TFLite with the NNAPI delegate for hardware acceleration.

UI: Result Interpretation

A hypnogram (a timeline chart of sleep phases) is the standard display. Add to it: a Sleep Score as an aggregate metric, a breakdown of time per phase, and identified patterns (late sleep onset, frequent awakenings).

Tie recommendations to patterns: "Deep sleep (N3) share dropped from 18% to 9% over 5 nights" plus specific advice, not a generic "improve sleep quality".
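One way to generate such a pattern-tied message (the function, thresholds, and wording are illustrative, not clinical guidance):

```python
def deep_sleep_trend_message(n3_shares):
    """Turn a per-night N3 share history into a specific recommendation.

    n3_shares: fraction of time in Deep NREM per night, oldest first.
    The 30% relative-drop threshold is an illustrative assumption.
    """
    first, last = n3_shares[0], n3_shares[-1]
    if last < first * 0.7:   # share dropped by more than 30% relative
        return (f"Deep sleep share dropped from {first:.0%} to {last:.0%} "
                f"over {len(n3_shares)} nights. Try a consistent bedtime "
                "and avoid caffeine after 14:00.")
    return "Deep sleep share is stable."
```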

Development Process

  • Choose data sources (HealthKit vs raw accelerometer)
  • Develop the overnight data processing pipeline
  • Feature engineering and classification model training
  • Model conversion and app integration
  • UI components: hypnogram and recommendations
  • Accuracy testing on real wearable data

Timeframe Estimates

A Wake/Sleep detector with actigraphy and a basic UI takes 1–2 weeks; a full four-stage classifier with a hypnogram and personalized recommendations takes 3–5 weeks.