Testing and QA
The project is in the final stretch before launch. Everything seems to work—on the developer's machine. Then the first real device test run: on a Samsung Galaxy A12 the game crashes at startup, on an iPad Air 2019 textures look blurry, on a Xiaomi Redmi Note 10 half the animations stutter. This isn't hypothetical—this is the standard state of a mobile project without a structured QA process.
Automated Testing in Unity
The most valuable quality investment is tests that run without human interaction. Unity provides Unity Test Framework (UTF)—a built-in tool based on NUnit.
Edit Mode vs. Play Mode Tests
Edit Mode Tests run without initializing the game loop. Execution time—milliseconds. Suitable for:
- Pure business logic (damage formulas, economy calculations, data validation)
- Utilities and helper systems
- Configuration file parsers
Examples of what makes sense to cover with Edit Mode tests: level progression system, balance formulas, inventory system.
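As a concrete sketch of this category: a pure balance formula tested with plain NUnit, no scene and no MonoBehaviour, so it runs in milliseconds. `DamageCalculator` here is a hypothetical class standing in for your own logic.

```csharp
// Edit Mode test sketch. DamageCalculator is a hypothetical pure-C# class;
// because it touches no Unity objects, NUnit can run this without the game loop.
using NUnit.Framework;

public static class DamageCalculator
{
    public static int Compute(int baseDamage, float critMultiplier)
        => (int)(baseDamage * critMultiplier);
}

public class DamageCalculatorTests
{
    [Test]
    public void CritDamage_IsBaseTimesMultiplier()
    {
        Assert.AreEqual(100, DamageCalculator.Compute(baseDamage: 50, critMultiplier: 2f));
    }
}
```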
Play Mode Tests run the full game cycle in real time. Slower execution, but let you test:
- Logic dependent on MonoBehaviour and Update()
- Coroutines and async operations
- Scene transitions
- System integration (UI + game logic + data)
```csharp
[UnityTest]
public IEnumerator PlayerTakeDamage_HealthReduces()
{
    var go = new GameObject();
    var health = go.AddComponent<HealthComponent>();
    health.Initialize(100);
    health.ApplyDamage(30);
    yield return null; // give one frame for updates to run
    Assert.AreEqual(70, health.CurrentHealth);
}
```
Practical rule: test what breaks most often—usually saves, economy, and combat logic. Don't chase 100% coverage; cover critical paths.
UI Testing
Unity UI Toolkit and legacy UGUI are hard to test automatically—most studios stick to manual testing. For basic automation, use InputSystem.QueueEvent to simulate input. For serious UI testing—custom utilities or Appium for mobile.
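For the basic-automation route, a rough sketch of simulated input in a Play Mode test, using the Input System package's `InputTestFixture` helper. The panel name and assertion are placeholders for your own UI; treat the exact fixture methods as an assumption to verify against your installed package version.

```csharp
// Sketch: simulating a tap in a Play Mode test via the Input System's
// InputTestFixture (com.unity.inputsystem). levelSelectPanel is a placeholder.
using System.Collections;
using NUnit.Framework;
using UnityEngine;
using UnityEngine.InputSystem;
using UnityEngine.TestTools;

public class MenuUITests : InputTestFixture
{
    [UnityTest]
    public IEnumerator TapPlayButton_OpensLevelSelect()
    {
        var touchscreen = InputSystem.AddDevice<Touchscreen>();
        BeginTouch(1, new Vector2(240, 400), screen: touchscreen); // press down
        yield return null;
        EndTouch(1, new Vector2(240, 400), screen: touchscreen);   // release (tap)
        yield return null;
        // Assert against your own UI state here, e.g.:
        // Assert.IsTrue(levelSelectPanel.activeSelf);
    }
}
```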
Performance Profiling
Tests won't catch performance issues—for that you need a profiler. Unity provides several tools, and they must be used together, not separately.
Unity Profiler: CPU and GPU
Unity Profiler is the starting point. Open it via Window → Analysis → Profiler; the com.unity.profiling.core package adds APIs for custom markers and counters. Key steps:
- Profile on the target device, not in the editor. The editor adds significant overhead. Connect the device via USB and run profiling from Build Settings with Development Build and Autoconnect Profiler enabled.
- Watch Main Thread and Render Thread separately. Typical issues:
  - Physics.FixedUpdate > 2 ms: physics is too complex
  - Canvas.BuildBatch every frame: UI is rebuilding due to extra dirty-flag calls
  - GC.Collect: garbage collector spikes, meaning memory is allocated in a hot path (inside Update() or coroutines)
- Markers help localize problems:
```csharp
using Unity.Profiling;

static readonly ProfilerMarker k_PathfindMarker = new ProfilerMarker("Pathfinding.Calculate");

void UpdateAI()
{
    using (k_PathfindMarker.Auto())
    {
        // your pathfinding code
    }
}
```
Memory Profiler
Memory Profiler (package com.unity.memoryprofiler) — separate tool for memory analysis. Allows you to:
- Take memory snapshot at specific moment
- Find leaks — objects not freed after scene unload
- Compare two snapshots to detect memory growth
- See what uses most memory (usually textures)
Most common causes of mobile memory problems:
- Textures without proper Compression Format (use ASTC for iOS, ETC2 for Android)
- Audio in WAV format instead of compressed Vorbis
- Objects in DontDestroyOnLoad that accumulate between sessions
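When comparing snapshots around a scene unload, it helps to force-release assets first, so that only real leaks remain in the diff. A minimal sketch:

```csharp
// Sketch: unload a scene, then drop assets that are no longer referenced,
// so a follow-up Memory Profiler snapshot shows genuine leaks only.
using System.Collections;
using UnityEngine;
using UnityEngine.SceneManagement;

public class LevelUnloader : MonoBehaviour
{
    public IEnumerator UnloadLevel(string sceneName)
    {
        yield return SceneManager.UnloadSceneAsync(sceneName);
        yield return Resources.UnloadUnusedAssets(); // unreferenced textures/audio
        System.GC.Collect();                         // collect managed leftovers
    }
}
```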
Frame Debugger
Frame Debugger (Window → Analysis → Frame Debugger) — walk through each draw call in a specific frame. Invaluable for:
- Finding extra draw calls (overdraw)
- Diagnosing batching — why objects don't merge into single draw call
- Shadow and post-processing issues
A healthy range for mobile projects is 50-150 draw calls per frame. Above 300, batching isn't working or the scene is overloaded.
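One frequent batching breaker the Frame Debugger exposes: assigning `renderer.material` clones the material per object, so each object gets its own draw call. In the built-in render pipeline, a `MaterialPropertyBlock` is the usual fix; a sketch:

```csharp
// Sketch: per-instance tint without cloning the material.
// renderer.material creates a copy and breaks batching; a
// MaterialPropertyBlock keeps the shared material intact.
using UnityEngine;

public class TintWithoutBatchBreak : MonoBehaviour
{
    static readonly int ColorId = Shader.PropertyToID("_Color");

    public void SetTint(Color c)
    {
        var rend = GetComponent<Renderer>();
        var block = new MaterialPropertyBlock();
        rend.GetPropertyBlock(block);
        block.SetColor(ColorId, c);
        rend.SetPropertyBlock(block);
    }
}
```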
Functional and Regression Testing
Test Plans and Checklists
Functional testing is built on test plans—documents describing specific scenarios. For each feature: expected result, reproduction steps, pass criteria.
Regression testing—running accumulated test cases before each release. Goal: ensure new changes didn't break old functionality.
Tools for test case management: TestRail, Qase, Zephyr (for Jira). For small teams, Notion or Google Sheets with structured checklists suffice.
Device Testing
Mobile testing without real devices is incomplete. Emulators don't reproduce memory issues, thermal throttling, real GPU behavior. Minimum device test set:
- Android Low-end: Snapdragon 662 / MediaTek Helio G85, 3GB RAM (Samsung A12, Redmi Note 10)
- Android Mid-range: Snapdragon 720G / 765G, 6GB RAM
- Android Flagship: Snapdragon 888+, 8GB RAM
- iOS: iPhone SE 2 (minimum supported), iPhone 13/14, iPad (latest generation)
For large-scale testing—cloud device farms: Firebase Test Lab (Google Play integration), BrowserStack App Automate, AWS Device Farm. Run automated tests on hundreds of real devices in parallel.
Load Testing
Relevant for multiplayer projects. Goal—find server performance degradation before launch, not after.
Tools:
- k6 — scripted load testing for WebSocket and HTTP API. Well-suited for game backends.
- Gatling — Java/Scala, supports complex stateful scenarios (auth → matchmaking → game session).
- Custom stress client — for specific game protocols, often simpler to write custom stress client in Go or C#.
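A minimal sketch of such a custom stress client in plain C# (no Unity): it opens N concurrent WebSocket connections and measures connect latency. The endpoint URL, client count, and message format are placeholders for your own protocol.

```csharp
// Stress-client sketch using System.Net.WebSockets. The URL and the
// "hello" payload are placeholders; real clients would also play out
// a session (auth -> matchmaking -> game loop).
using System;
using System.Diagnostics;
using System.Linq;
using System.Net.WebSockets;
using System.Text;
using System.Threading;
using System.Threading.Tasks;

class StressClient
{
    static async Task Main()
    {
        const int clients = 500; // simulated CCU, placeholder
        var uri = new Uri("wss://example.com/game"); // placeholder endpoint
        var latencies = await Task.WhenAll(
            Enumerable.Range(0, clients).Select(_ => ConnectOnce(uri)));
        Console.WriteLine($"avg connect: {latencies.Average():F1} ms");
    }

    static async Task<double> ConnectOnce(Uri uri)
    {
        using var ws = new ClientWebSocket();
        var sw = Stopwatch.StartNew();
        await ws.ConnectAsync(uri, CancellationToken.None);
        sw.Stop();
        var hello = Encoding.UTF8.GetBytes("{\"op\":\"hello\"}"); // placeholder message
        await ws.SendAsync(hello, WebSocketMessageType.Text, true, CancellationToken.None);
        await ws.CloseAsync(WebSocketCloseStatus.NormalClosure, "done", CancellationToken.None);
        return sw.Elapsed.TotalMilliseconds;
    }
}
```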
What to check:
- Behavior at peak CCU (concurrent users)
- Latency degradation under load
- Memory leaks on server during extended runs (72+ hours)
- Graceful degradation behavior—what happens when a node fails
Crash Reporting and Monitoring
After launch, QA continues through monitoring. Without crash reporting, you learn about critical errors from App Store reviews—too late.
- Firebase Crashlytics — standard for mobile games. Auto crash collection with stack trace symbolization, grouping by cause, real-time alerts.
- Sentry — for server components and WebGL. Supports C# and most server languages.
Important: configure ANR (Application Not Responding) reporting separately from crashes—these are hangs without crashes that Android Play Console detects separately.
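Crash reporting works best when caught gameplay exceptions are also routed to the dashboard as non-fatals, not silently swallowed. A sketch assuming the Firebase Unity SDK; the `"context"` key name is an arbitrary convention, not part of the API:

```csharp
// Sketch: report caught exceptions to Crashlytics as non-fatals
// (Firebase Unity SDK assumed). The custom key is a project convention.
using Firebase.Crashlytics;

public static class ErrorReporter
{
    public static void ReportNonFatal(System.Exception e, string context)
    {
        Crashlytics.SetCustomKey("context", context); // e.g. "shop_purchase"
        Crashlytics.LogException(e); // grouped as a non-fatal in the dashboard
    }
}
```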
Common QA Process Mistakes
- Testing only on the developers' flagship devices. Most players use mid-range and low-end devices.
- No automated tests for economy and saves—these break when refactoring.
- Skipping server load testing. Game passes QA, launches, hits 10k DAU—server goes down.
- Regression testing only before major releases. Run regression before every public update.