Touch Input

Flint’s input system extends seamlessly to touchscreens through touch zones — named screen regions that map to the same action bindings used by keyboard and gamepad. A single input config can drive a game on desktop and mobile without code changes.

How It Works

Touch input integrates into the existing InputState system in flint-runtime:

Touch event (OS)         Normalized tracking         Action evaluation
TouchStart(id, x, y) ──► TouchPoint { id, pos,  ──► binding_value(TouchZone)
TouchMove(id, x, y)      start_pos, phase,          checks zone containment
TouchEnd(id)              start_time }               produces action values

Touch coordinates are normalized to the [0..1] range (0,0 = top-left; 1,1 = bottom-right), making zone definitions resolution-independent. Tap detection uses two simple thresholds: a touch qualifies as a tap when its elapsed time is under 300 ms and its movement distance is under 20 pixels.
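The tap heuristic above can be sketched as a small predicate. This is illustrative, not the engine's actual code: the `TouchPoint` fields mirror the diagram, and the screen dimensions are needed to convert normalized movement back into pixels.

```rust
// Illustrative tap check: under 300 ms and under 20 px of movement.
// Field names follow the diagram; `start_time_ms` is an assumption.
struct TouchPoint {
    start_pos: (f64, f64), // normalized [0..1]
    pos: (f64, f64),
    start_time_ms: f64,
}

fn is_tap(t: &TouchPoint, end_time_ms: f64, screen_w: f64, screen_h: f64) -> bool {
    // Convert normalized displacement back to pixels for the distance test.
    let dx = (t.pos.0 - t.start_pos.0) * screen_w;
    let dy = (t.pos.1 - t.start_pos.1) * screen_h;
    let dist = (dx * dx + dy * dy).sqrt();
    let elapsed = end_time_ms - t.start_time_ms;
    elapsed < 300.0 && dist < 20.0
}

fn main() {
    let t = TouchPoint { start_pos: (0.5, 0.5), pos: (0.505, 0.5), start_time_ms: 0.0 };
    // 0.005 * 1920 = 9.6 px of movement in 120 ms: a tap.
    println!("{}", is_tap(&t, 120.0, 1920.0, 1080.0)); // prints "true"
}
```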

Touch Zones

Touch zones are named rectangular screen regions. Five built-in zones cover the most common layouts:

Zone          Region          Common Use
full_screen   Entire screen   Global taps, swipes
left_half     Left 50%        Move left, D-pad left
right_half    Right 50%       Move right, D-pad right
top_half      Top 50%         Look up, jump
bottom_half   Bottom 50%      Look down, crouch

Zones are defined as normalized rectangles (x, y, width, height). For example, left_half is (0.0, 0.0, 0.5, 1.0).
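Zone containment is then a plain rectangle test in normalized coordinates. A minimal sketch, following the (x, y, width, height) convention above (the `Zone` struct and `contains` function are illustrative names, not Flint's API):

```rust
// Normalized rectangle (x, y, width, height), as described above.
struct Zone { x: f64, y: f64, w: f64, h: f64 }

// A touch point (also normalized to [0..1]) is inside the zone when it
// falls within the half-open rectangle [x, x+w) x [y, y+h).
fn contains(z: &Zone, px: f64, py: f64) -> bool {
    px >= z.x && px < z.x + z.w && py >= z.y && py < z.y + z.h
}

fn main() {
    let left_half = Zone { x: 0.0, y: 0.0, w: 0.5, h: 1.0 };
    println!("{}", contains(&left_half, 0.25, 0.9)); // prints "true"
    println!("{}", contains(&left_half, 0.75, 0.9)); // prints "false"
}
```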

Input Configuration

Touch zones use the same InputConfig TOML format as keyboard and gamepad bindings. A single action can have multiple binding types:

# input.toml
version = 1
game_id = "my_game"

[actions.move_left]
kind = "button"
[[actions.move_left.bindings]]
type = "key"
code = "KeyA"
[[actions.move_left.bindings]]
type = "touch_zone"
zone = "left_half"

[actions.move_right]
kind = "button"
[[actions.move_right.bindings]]
type = "key"
code = "KeyD"
[[actions.move_right.bindings]]
type = "touch_zone"
zone = "right_half"

[actions.jump]
kind = "button"
[[actions.jump.bindings]]
type = "key"
code = "Space"

Touch zone bindings support a scale field (default 1.0) that multiplies the action value, just like gamepad axis bindings.
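Putting zone containment and `scale` together, the evaluation of a touch-zone binding for a button action might look like the following sketch (function and parameter names are assumptions for illustration; the real `binding_value` lives in flint-runtime):

```rust
// Hypothetical evaluation of a touch_zone binding: the action value is
// `scale` if any active touch lies inside the zone, else 0.0.
// The zone tuple follows the (x, y, width, height) convention.
fn touch_zone_value(touches: &[(f64, f64)], zone: (f64, f64, f64, f64), scale: f64) -> f64 {
    let (zx, zy, zw, zh) = zone;
    let hit = touches
        .iter()
        .any(|&(x, y)| x >= zx && x < zx + zw && y >= zy && y < zy + zh);
    if hit { scale } else { 0.0 }
}

fn main() {
    let touches = [(0.3, 0.5)]; // one finger on the left side
    let left_half = (0.0, 0.0, 0.5, 1.0);
    println!("{}", touch_zone_value(&touches, left_half, 1.0)); // prints "1"
}
```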

Binding Format

[[actions.my_action.bindings]]
type = "touch_zone"
zone = "left_half"     # Zone name (one of the 5 built-in zones)
scale = 1.0            # Optional: action value multiplier

Mouse-as-Touch Emulation

By default, Flint emulates touch input from mouse clicks on desktop. A left-click-and-drag produces touch events with finger ID 0, letting you test touch-based games without a touchscreen.

  • Enabled by default (emulate_touch_from_mouse = true)
  • Automatically disabled when a real touch event arrives
  • Left mouse button maps to finger 0
  • Mouse position maps to touch position

This means touch-zone bindings work immediately on desktop during development.
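The emulation rules above reduce to a small state transition on the left mouse button. A sketch under assumed names (the `TouchEvent` enum mirrors the diagram's event shapes; `emulate` is not Flint's actual API):

```rust
// Hypothetical mouse-to-touch mapping: press starts a finger-0 touch,
// drag moves it, release ends it.
#[derive(Debug, PartialEq)]
enum TouchEvent {
    Start { id: u64, x: f64, y: f64 },
    Move { id: u64, x: f64, y: f64 },
    End { id: u64 },
}

fn emulate(was_down: bool, is_down: bool, x: f64, y: f64) -> Option<TouchEvent> {
    match (was_down, is_down) {
        (false, true) => Some(TouchEvent::Start { id: 0, x, y }), // press
        (true, true) => Some(TouchEvent::Move { id: 0, x, y }),  // drag
        (true, false) => Some(TouchEvent::End { id: 0 }),        // release
        (false, false) => None,                                  // idle
    }
}

fn main() {
    println!("{:?}", emulate(false, true, 0.5, 0.5)); // press starts the touch
}
```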

Tap Detection

Taps are detected automatically when a touch ends:

  • Duration < 300ms (from touch start to touch end)
  • Distance < 20 pixels (from start position to end position)

Taps are available for one frame after detection and are consumed on read.
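One way to implement "available for one frame, consumed on read" is a buffer that is drained either by the first read or by the frame boundary, whichever comes first. This is a sketch of the pattern, not the runtime's actual structure:

```rust
// Illustrative tap buffer: taps detected this frame live here until
// read once, and are cleared at the end of the frame regardless.
#[derive(Default)]
struct TapBuffer {
    taps: Vec<(f64, f64)>, // normalized (x, y) tap positions
}

impl TapBuffer {
    fn push(&mut self, x: f64, y: f64) {
        self.taps.push((x, y));
    }

    // Consume-on-read: a second read in the same frame sees nothing.
    fn take(&mut self) -> Vec<(f64, f64)> {
        std::mem::take(&mut self.taps)
    }

    // Unread taps do not survive past the frame they were detected in.
    fn end_frame(&mut self) {
        self.taps.clear();
    }
}

fn main() {
    let mut buf = TapBuffer::default();
    buf.push(0.5, 0.5);
    println!("{}", buf.take().len()); // prints "1"
    println!("{}", buf.take().len()); // prints "0"
}
```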

Scripting API

Touch state is accessible from Rhai scripts via these functions:

Touch Tracking

Function                  Returns   Description
touch_count()             i64       Number of currently active touches
touch_x(index)            f64       Normalized X position (0–1) of the touch at index
touch_y(index)            f64       Normalized Y position (0–1) of the touch at index
is_touching(id)           bool      Whether the given touch ID is currently active
touch_just_started(id)    bool      Whether the touch ID just became active this frame
touch_just_ended(id)      bool      Whether the touch ID just ended this frame

Tap Detection

Function       Returns   Description
tap_count()    i64       Number of taps detected this frame
tap_x(index)   f64       Normalized X position of the tap at index
tap_y(index)   f64       Normalized Y position of the tap at index

Example: Touch-Driven Movement

// scripts/touch_controller.rhai
fn on_update() {
    let me = self_entity();
    let speed = 5.0;
    let dt = delta_time();

    // Action-based movement works with both keyboard and touch
    if action_held("move_left") {
        let pos = get_position(me);
        set_position(me, pos.x - speed * dt, pos.y, pos.z);
    }
    if action_held("move_right") {
        let pos = get_position(me);
        set_position(me, pos.x + speed * dt, pos.y, pos.z);
    }

    // Direct tap handling for jumping
    if tap_count() > 0 {
        // Jump on any tap
        fire_event("jump");
    }
}

fn on_draw_ui() {
    // Visualize active touches (useful for debugging)
    let w = screen_width();
    let h = screen_height();

    for i in 0..touch_count() {
        let tx = touch_x(i) * w;
        let ty = touch_y(i) * h;
        draw_circle(tx, ty, 20.0, 1.0, 1.0, 1.0, 0.4);
    }
}

Design Philosophy

The touch system is intentionally minimal. Rather than providing virtual joysticks, gesture recognizers, or complex multi-touch state machines, it gives you:

  1. Zone-based action bindings — works with the existing input config system
  2. Raw touch state — positions, phases, and tap detection exposed to scripts
  3. Mouse emulation — desktop testing without hardware

Game-specific touch UI (virtual D-pads, on-screen buttons, swipe gestures) belongs in scripts, not the engine. The engine provides the primitives; scripts compose them into the interaction model that fits each game.

Further Reading