Implementing a Chatbot Scenario Builder in Mobile Apps
A scenario builder is a tool for creating and editing dialog flows without programming. On mobile it is particularly challenging: limited screen space, no mouse, no keyboard as the primary input. Everything happens through taps and gestures on a canvas of nodes and connections.
What a Scenario Builder Represents
A typical bot scenario is a graph: nodes (messages, questions, conditions, actions) and edges (transitions between them). Users create nodes, configure their content, and connect them with arrows. Visually it is similar to a Miro or Figma diagram, just with chatbot domain logic.
Key node types (modeled as an enum in the sketch after this list):
- Message — bot sends text/media
- Input — bot waits for user response
- Condition — branching based on condition (response contains word X, variable Y > 0)
- Action — integration with external system (send email, create CRM record)
- GoTo — jump to another node or scenario
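A minimal Swift sketch of how these node types could be modeled (names and comments are illustrative, not a fixed API):

enum NodeType: String, Codable {
    case message    // bot sends text/media
    case input      // bot waits for a user response and stores it in a variable
    case condition  // branching based on a predicate over collected data
    case action     // integration with an external system (email, CRM record)
    case goto       // jump to another node or scenario
}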
Graph Editor Architecture
Implementing a graph editor on mobile is non-trivial. Two main approaches:
WebView-based. Embed a web editor (React Flow, JointJS, mxGraph) in WKWebView / WebView. Communication goes via WKScriptMessageHandler / addJavascriptInterface. Pros: reuse of the web version, rich graph libraries. Cons: performance on complex graphs, loss of native look & feel, bridge overhead.
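A minimal sketch of the native side of that bridge on iOS, assuming the embedded editor posts graph changes to a handler named "graphChanged" (the handler name and message shape are assumptions):

import WebKit

final class EditorBridge: NSObject, WKScriptMessageHandler {
    // Called when JS runs window.webkit.messageHandlers.graphChanged.postMessage(...)
    func userContentController(_ userContentController: WKUserContentController,
                               didReceive message: WKScriptMessage) {
        guard message.name == "graphChanged",
              let json = message.body as? [String: Any] else { return }
        // Persist or forward the updated graph JSON here.
        print("graph updated:", json)
    }
}

func makeEditorWebView(bridge: EditorBridge) -> WKWebView {
    let config = WKWebViewConfiguration()
    config.userContentController.add(bridge, name: "graphChanged")
    return WKWebView(frame: .zero, configuration: config)
}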
Native Canvas. iOS — UIScrollView as infinite canvas container, nodes as UIView/CALayer, edges drawn via CAShapeLayer with UIBezierPath. Gestures: UIPanGestureRecognizer for node movement and canvas scrolling, UIPinchGestureRecognizer for zoom, UITapGestureRecognizer for selection. In SwiftUI — Canvas API with GraphicsContext for drawing and Gesture modifiers.
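A sketch of drawing a single edge as a bezier curve between two node views with CAShapeLayer (simplified; a real editor also handles arrowheads and hit-testing on edges):

import UIKit

// Connects the right edge of `source` to the left edge of `target`,
// both positioned inside the same canvas view.
func makeEdgeLayer(from source: UIView, to target: UIView) -> CAShapeLayer {
    let start = CGPoint(x: source.frame.maxX, y: source.frame.midY)
    let end   = CGPoint(x: target.frame.minX, y: target.frame.midY)
    let dx = max((end.x - start.x) / 2, 40)

    let path = UIBezierPath()
    path.move(to: start)
    path.addCurve(to: end,
                  controlPoint1: CGPoint(x: start.x + dx, y: start.y),
                  controlPoint2: CGPoint(x: end.x - dx, y: end.y))

    let layer = CAShapeLayer()
    layer.path = path.cgPath
    layer.strokeColor = UIColor.systemGray.cgColor
    layer.fillColor = nil
    layer.lineWidth = 2
    return layer
}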
Gesture conflicts are the most painful problem. Pan for moving a node vs pan for scrolling the canvas: differentiate via hit-test (view.hitTest(_:with:)) — if the gesture started on a node, move the node; if on empty space, scroll the canvas. Use UIGestureRecognizerDelegate.gestureRecognizer(_:shouldRecognizeSimultaneouslyWith:) for fine-tuning.
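One way to route a pan either to a node or to the canvas, a sketch assuming node views are tracked in an array (the check here uses frame containment; view.hitTest(_:with:) works just as well):

import UIKit

final class NodeDragHandler: NSObject, UIGestureRecognizerDelegate {
    weak var canvas: UIView?       // the scroll view's content view
    var nodeViews: [UIView] = []   // views representing graph nodes
    private var draggedNode: UIView?

    @objc func handlePan(_ pan: UIPanGestureRecognizer) {
        guard let canvas else { return }
        switch pan.state {
        case .began:
            // If the touch landed on a node, drag it; otherwise do nothing
            // and let the scroll view pan the canvas.
            let point = pan.location(in: canvas)
            draggedNode = nodeViews.first { $0.frame.contains(point) }
        case .changed:
            guard let node = draggedNode else { return }
            let t = pan.translation(in: canvas)
            node.center = CGPoint(x: node.center.x + t.x, y: node.center.y + t.y)
            pan.setTranslation(.zero, in: canvas)
        default:
            draggedNode = nil
        }
    }

    // Only receive touches that start on a node; touches on empty space
    // fall through to the scroll view's own pan recognizer.
    func gestureRecognizer(_ gestureRecognizer: UIGestureRecognizer,
                           shouldReceive touch: UITouch) -> Bool {
        guard let canvas else { return false }
        return nodeViews.contains { $0.frame.contains(touch.location(in: canvas)) }
    }
}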
For Flutter — use the flutter_flow_chart library or build a custom editor via CustomPainter + GestureDetector. CustomPainter.paint() is called on every repaint, so cache Path objects for edges.
Scenario Data Model
The graph is serialized to JSON for storage and transmission. A minimal schema:
{
  "id": "scenario_123",
  "nodes": [
    {"id": "n1", "type": "message", "x": 100, "y": 200, "data": {"text": "Hello!"}},
    {"id": "n2", "type": "input", "x": 300, "y": 200, "data": {"variable": "user_name"}}
  ],
  "edges": [
    {"id": "e1", "source": "n1", "target": "n2", "sourceHandle": "output", "targetHandle": "input"}
  ]
}
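A minimal Codable model matching this schema (a sketch; real node payloads would be typed per node type rather than a string dictionary):

struct Scenario: Codable {
    let id: String
    var nodes: [Node]
    var edges: [Edge]
}

struct Node: Codable {
    let id: String
    let type: String           // "message", "input", "condition", "action", "goto"
    var x: Double
    var y: Double
    var data: [String: String] // simplified payload
}

struct Edge: Codable {
    let id: String
    let source: String
    let target: String
    let sourceHandle: String
    let targetHandle: String
}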
Validate the graph before saving: no isolated nodes, no cycles (unless intended), a start node is defined, and all condition nodes have at least two outgoing edges. Validate on the client (instant feedback) and on the server (data protection).
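A sketch of the client-side checks over the Scenario model above (cycle detection omitted for brevity; the error type is illustrative):

enum ValidationError: Error {
    case noStartNode
    case isolatedNode(String)
    case conditionNeedsBranches(String)
}

func validate(_ scenario: Scenario, startNodeID: String) throws {
    guard scenario.nodes.contains(where: { $0.id == startNodeID }) else {
        throw ValidationError.noStartNode
    }
    let connected = Set(scenario.edges.flatMap { [$0.source, $0.target] })
    for node in scenario.nodes {
        // Every node except the start node must be touched by at least one edge.
        if node.id != startNodeID && !connected.contains(node.id) {
            throw ValidationError.isolatedNode(node.id)
        }
        // Condition nodes must branch: at least two outgoing edges.
        if node.type == "condition",
           scenario.edges.filter({ $0.source == node.id }).count < 2 {
            throw ValidationError.conditionNeedsBranches(node.id)
        }
    }
}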
Node Editing on Mobile
Tapping a node opens a bottom sheet or modal with an editing form. UISheetPresentationController (iOS 15+) with detents: [.medium(), .large()] — the user can expand to full screen for long text. In Compose — ModalBottomSheet from Material3.
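A sketch of presenting the editor with medium/large detents on iOS 15+ (NodeEditorViewController is a hypothetical screen, not a UIKit class):

import UIKit

// Hypothetical editor screen for a single node; content depends on node type.
final class NodeEditorViewController: UIViewController {
    let nodeID: String
    init(nodeID: String) {
        self.nodeID = nodeID
        super.init(nibName: nil, bundle: nil)
    }
    required init?(coder: NSCoder) { fatalError("not supported") }
}

func presentNodeEditor(for nodeID: String, from parent: UIViewController) {
    let editor = NodeEditorViewController(nodeID: nodeID)
    if let sheet = editor.sheetPresentationController {
        sheet.detents = [.medium(), .large()]  // half-height, expandable to full screen
        sheet.prefersGrabberVisible = true
    }
    parent.present(editor, animated: true)
}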
For a message node with rich text support — UITextView with NSAttributedString or TextEditor in SwiftUI. For conditions — a DSL builder: select a variable + select an operator + enter a value through three sequential pickers/inputs.
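The three-step builder maps onto a small model (a sketch; the operator set is an assumption):

enum ConditionOperator: String, CaseIterable, Codable {
    case equals, notEquals, contains, greaterThan, lessThan
}

struct Condition: Codable {
    var variable: String       // step 1: pick a scenario variable
    var op: ConditionOperator  // step 2: pick an operator
    var value: String          // step 3: enter the value to compare against
}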
Canvas Scaling and Navigation
With 50+ nodes on the canvas, you need a minimap — a reduced preview of the entire graph with a viewport indicator. Draw it via UIGraphicsImageRenderer or Canvas.drawImage() from a canvas thumbnail, and update it with a 200 ms debounce on node position changes.
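A sketch of the UIKit side: render a scaled-down snapshot of the canvas with UIGraphicsImageRenderer, debounced by 200 ms (the scale factor is an assumption):

import UIKit

final class MinimapRenderer {
    private var pendingWork: DispatchWorkItem?

    // Schedules a thumbnail render 200 ms after the last call, so a node drag
    // produces one redraw instead of one per moved point.
    func scheduleUpdate(of canvas: UIView, scale: CGFloat = 0.1,
                        completion: @escaping (UIImage) -> Void) {
        pendingWork?.cancel()
        let work = DispatchWorkItem {
            let size = CGSize(width: canvas.bounds.width * scale,
                              height: canvas.bounds.height * scale)
            let image = UIGraphicsImageRenderer(size: size).image { ctx in
                ctx.cgContext.scaleBy(x: scale, y: scale)
                canvas.layer.render(in: ctx.cgContext)
            }
            completion(image)
        }
        pendingWork = work
        DispatchQueue.main.asyncAfter(deadline: .now() + 0.2, execute: work)
    }
}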
Auto-layout: an "Organize" button arranges the graph via the Sugiyama algorithm (layered graph drawing) or a simpler topological sort with even distribution across layers. Ready-made implementations: Graphviz via WebAssembly, or a simplified version for trees without cycles.
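A simplified layering pass for cycle-free graphs over the Scenario model above: nodes are assigned a layer by BFS depth from the start node and spread evenly within each layer (a sketch, not Sugiyama with crossing minimization):

// Assigns x by layer and y by position within the layer.
func autoLayout(_ scenario: inout Scenario, startNodeID: String,
                layerSpacing: Double = 250, nodeSpacing: Double = 140) {
    // BFS from the start node to assign each node a layer (depth).
    var layerOf: [String: Int] = [startNodeID: 0]
    var queue = [startNodeID]
    while !queue.isEmpty {
        let current = queue.removeFirst()
        let depth = layerOf[current, default: 0]
        for edge in scenario.edges where edge.source == current && layerOf[edge.target] == nil {
            layerOf[edge.target] = depth + 1
            queue.append(edge.target)
        }
    }
    // Distribute nodes evenly within each layer.
    let byLayer = Dictionary(grouping: scenario.nodes.indices) {
        layerOf[scenario.nodes[$0].id, default: 0]
    }
    for (layer, indices) in byLayer {
        for (row, index) in indices.enumerated() {
            scenario.nodes[index].x = Double(layer) * layerSpacing
            scenario.nodes[index].y = Double(row) * nodeSpacing
        }
    }
}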
Timeline: 1 week to 3 months. WebView-based editor MVP with basic node types — 1–2 weeks. Full native editor with custom node types, validation, scenario versioning, and dialog testing — 1–3 months.