Game UX/UI Development
Game UI is neither web design nor mobile app design. A shooter's HUD is processed peripherally while under fire; an inventory menu is open for seconds per session yet remembered long-term. Different tasks mean different requirements for both design and implementation.
Technical pitfalls of game UI
Canvas in Screen Space – Overlay mode on mobile is a classic source of problems. Unity rebuilds the geometry of the entire Canvas whenever any child element changes: an animated coin counter forces a rebatch of the whole HUD every frame. The solution is to split the Canvas into static and dynamic parts and give frequently updated elements their own nested Canvas component.
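A minimal sketch of the nested-Canvas isolation technique described above. The component name is illustrative; the key point is that a nested Canvas starts its own rebuild scope:

```csharp
using UnityEngine;
using UnityEngine.UI;

// Sketch: isolating a frequently updated HUD element (e.g. a coin counter)
// on its own nested Canvas, so its animation only dirties this sub-canvas
// instead of triggering a rebuild of the entire parent HUD Canvas.
[RequireComponent(typeof(RectTransform))]
public class IsolatedHudElement : MonoBehaviour
{
    void Awake()
    {
        // A nested Canvas creates a separate rebuild/batch scope.
        var canvas = gameObject.AddComponent<Canvas>();
        canvas.overrideSorting = true;  // control draw order relative to the HUD
        canvas.sortingOrder = 1;

        // Required so interactive elements under this sub-canvas still receive input.
        gameObject.AddComponent<GraphicRaycaster>();
    }
}
```

Attach this to the coin counter's root object (or add the Canvas + GraphicRaycaster manually in the editor, which is the more common workflow).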
Scaling for different aspect ratios. Canvas Scaler in Scale With Screen Size mode with Reference Resolution 1920×1080 and Match = 0.5 is the standard setup, but not a cure-all. On iPad (4:3) or foldables with unusual aspect ratios, the UI drifts apart. RectTransform anchors must be set deliberately for each element; don't rely on the defaults.
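As an illustration of deliberate anchoring, here is a sketch that pins an element to a screen corner so it keeps its position on both 4:3 tablets and wide foldables (the helper name and margin are assumptions, not from the article):

```csharp
using UnityEngine;

// Sketch: anchoring a HUD element to the bottom-left corner via anchors
// rather than fixed offsets from a reference resolution.
public static class AnchorUtil
{
    public static void PinBottomLeft(RectTransform rt, Vector2 margin)
    {
        rt.anchorMin = Vector2.zero;   // anchor both corners to bottom-left
        rt.anchorMax = Vector2.zero;
        rt.pivot     = Vector2.zero;
        rt.anchoredPosition = margin;  // offset measured from the anchor point,
                                       // so it survives aspect-ratio changes
    }
}
```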
TextMeshPro and the Dynamic Font Atlas. If the game supports multiple languages with Cyrillic, Greek, or Arabic scripts, one atlas won't suffice. When characters are added at runtime, Unity re-bakes the atlas, causing a 2–5 ms freeze. For performance-critical screens, use static atlases with explicitly defined character sets.
How we design game UI
HUD: information without distraction
A good HUD makes information accessible without demanding the player's focus. We work by ambient-information principles: the HP bar changes color as health drops and pulses at a critical level, so the player reads the threat peripherally. We remove numbers wherever a visual state suffices.
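The ambient HP bar described above can be sketched roughly like this (threshold and pulse speed are illustrative values, not from the article):

```csharp
using UnityEngine;
using UnityEngine.UI;

// Sketch: an HP bar whose color shifts with health and which pulses
// below a critical threshold, readable peripherally without numbers.
public class AmbientHpBar : MonoBehaviour
{
    [SerializeField] Image fill;                     // filled-type Image for the bar
    [SerializeField] float criticalThreshold = 0.25f;
    float current = 1f;

    public void SetHealth(float normalized)          // 0..1
    {
        current = Mathf.Clamp01(normalized);
        fill.fillAmount = current;
        // Green -> red gradient communicates state without a number.
        fill.color = Color.Lerp(Color.red, Color.green, current);
    }

    void Update()
    {
        if (current >= criticalThreshold) return;
        // Pulse alpha at low HP so the danger registers in peripheral vision.
        float pulse = 0.7f + 0.3f * Mathf.Sin(Time.time * 8f);
        var c = fill.color; c.a = pulse; fill.color = c;
    }
}
```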
For each HUD element we define an update frequency: HP changes on each hit, the mini-map updates every 0.5 s, the timer every second. A Time.deltaTime accumulator in Update(), instead of writing to the UI every frame, reduces Canvas rebuild load.
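The accumulator pattern looks roughly like this (the mini-map and its 0.5 s interval mirror the example above; the class name is illustrative):

```csharp
using UnityEngine;
using UnityEngine.UI;

// Sketch: throttling a HUD element's updates with a deltaTime accumulator,
// so the Canvas is dirtied at most twice per second for this element
// instead of every frame.
public class ThrottledMinimap : MonoBehaviour
{
    [SerializeField] RawImage minimap;
    const float Interval = 0.5f;   // seconds between redraws
    float accumulator;

    void Update()
    {
        accumulator += Time.deltaTime;
        if (accumulator < Interval) return;
        accumulator -= Interval;   // keep the remainder so intervals stay even
        Redraw();
    }

    void Redraw()
    {
        // Actual mini-map update goes here. Touching the UI only in this
        // method is what limits the Canvas rebuild frequency.
    }
}
```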
Menu screens and navigation
Gamepad and keyboard support in Unity UI goes through each Selectable's Navigation settings. People often forget about consoles and Steam Deck, where a mouse may be unavailable. On critical screens we set navigation to Explicit for each button instead of leaving it on Automatic.
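Explicit navigation can be wired in code as well as in the inspector; a minimal sketch for a three-button menu (button names are illustrative):

```csharp
using UnityEngine;
using UnityEngine.UI;

// Sketch: explicit gamepad/keyboard navigation between menu buttons,
// replacing Unity's automatic layout-based guessing with a fixed order.
public class MenuNavigationSetup : MonoBehaviour
{
    [SerializeField] Button play, options, quit;

    void Awake()
    {
        Link(play,    up: quit,    down: options);
        Link(options, up: play,    down: quit);
        Link(quit,    up: options, down: play);   // wrap around
    }

    static void Link(Button b, Selectable up, Selectable down)
    {
        var nav = b.navigation;          // Navigation is a struct: copy, edit, assign back
        nav.mode = Navigation.Mode.Explicit;
        nav.selectOnUp = up;
        nav.selectOnDown = down;
        b.navigation = nav;
    }
}
```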
Transition animations between screens are done with DOTween or LeanTween rather than Animator for simple cases: Animator's overhead for a fade-in/fade-out is unjustified. For complex animation sequences (onboarding, cutscene interfaces) we use Animator plus AnimationEvent for gameplay sync.
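A sketch of the DOTween approach for a simple screen fade, assuming each screen has a CanvasGroup on its root (duration is an illustrative value):

```csharp
using DG.Tweening;   // DOTween
using UnityEngine;

// Sketch: fade-in/fade-out screen transitions with DOTween, the lightweight
// alternative to Animator mentioned above.
public class ScreenFader : MonoBehaviour
{
    const float Duration = 0.25f;

    public void Show(CanvasGroup screen)
    {
        screen.gameObject.SetActive(true);
        screen.alpha = 0f;
        screen.DOFade(1f, Duration).SetUpdate(true);  // SetUpdate(true): run even when Time.timeScale = 0
    }

    public void Hide(CanvasGroup screen)
    {
        screen.DOFade(0f, Duration)
              .SetUpdate(true)
              .OnComplete(() => screen.gameObject.SetActive(false));
    }
}
```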
Inventory and drag-and-drop
Drag-and-drop in Unity UI is implemented through the IBeginDragHandler, IDragHandler, and IEndDragHandler interfaces. The main problem is behavior when the pointer leaves the ScrollRect bounds. If a ScrollRect contains draggable elements, events must be routed explicitly through the EventSystem, and scrolling must be separated from item dragging by a movement threshold.
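One common way to separate the two gestures is to decide on the first drag delta and forward the event to the parent ScrollRect when it looks like a scroll. A sketch under that assumption (threshold, axis choice, and class name are illustrative):

```csharp
using UnityEngine;
using UnityEngine.EventSystems;
using UnityEngine.UI;

// Sketch: an inventory item inside a vertical ScrollRect. A mostly vertical
// gesture is routed to the ScrollRect (scrolling); a decisive horizontal
// gesture drags the item itself.
public class DraggableInventoryItem : MonoBehaviour,
    IBeginDragHandler, IDragHandler, IEndDragHandler
{
    [SerializeField] ScrollRect parentScroll;
    const float DragThreshold = 10f;   // pixels before committing to an item drag
    bool routeToScroll;

    public void OnBeginDrag(PointerEventData e)
    {
        routeToScroll = Mathf.Abs(e.delta.y) > Mathf.Abs(e.delta.x)
                        || e.delta.magnitude < DragThreshold;
        if (routeToScroll) parentScroll.OnBeginDrag(e);
    }

    public void OnDrag(PointerEventData e)
    {
        if (routeToScroll) { parentScroll.OnDrag(e); return; }
        transform.position = e.position;   // the actual item drag
    }

    public void OnEndDrag(PointerEventData e)
    {
        if (routeToScroll) parentScroll.OnEndDrag(e);
        // else: drop logic (target slot detection, snap-back) goes here
    }
}
```

ScrollRect itself implements the same drag interfaces, which is why its handlers can be called directly when forwarding.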
Long list virtualization — via pooling + ScrollRect. Displaying 500 inventory cells as 500 GameObjects is impractical; instantiate only the visible elements and reuse them via ObjectPool&lt;T&gt; (built into Unity since 2021 LTS).
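The pooling half of that setup can be sketched with Unity's built-in ObjectPool&lt;T&gt; from UnityEngine.Pool; the visibility logic that decides which indices are on screen is omitted and assumed to exist (capacity values are illustrative):

```csharp
using UnityEngine;
using UnityEngine.Pool;

// Sketch: a cell pool for a virtualized inventory list. Cells scrolled out
// of view are released and reused instead of destroyed/instantiated.
public class InventoryCellPool : MonoBehaviour
{
    [SerializeField] GameObject cellPrefab;
    [SerializeField] Transform content;        // the ScrollRect's content transform
    ObjectPool<GameObject> pool;

    void Awake()
    {
        pool = new ObjectPool<GameObject>(
            createFunc:      () => Instantiate(cellPrefab, content),
            actionOnGet:     c => c.SetActive(true),
            actionOnRelease: c => c.SetActive(false),
            actionOnDestroy: Destroy,
            defaultCapacity: 16,               // roughly one screenful of cells
            maxSize: 32);
    }

    public GameObject GetCell()           => pool.Get();
    public void ReleaseCell(GameObject c) => pool.Release(c);
}
```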
Work process
UX audit or designing from scratch (3-5 days). If UI exists — audit with concrete findings: where users get lost, what's overloaded, what doesn't scale. From scratch — user scenarios, wireframes, information architecture.
Design in Figma (1-2 weeks). Component library for Unity UI: buttons, panels, icons — with state variants (normal, hover, pressed, disabled). Sizes and spacing in dp/Unity units, not pixels.
Implementation (1-3 weeks). Assembly in Unity, animations, gameplay integration. Adaptation for resolutions and aspect ratios.
QA on devices. Essential: iOS (notch safe area), Android (various pixel densities), PC (1280×720, 1920×1080, 2560×1440, ultrawide).
Timelines range from one week for a simple HUD to a month or more for a complete UI kit with all screens. Cost is estimated after analyzing scope and platform requirements.