Human-Centered
Urban Digital Twins
for Behavioral and Spatial Analysis
Cities are built for people — yet we study them without people.
This platform puts humans back into the digital twin.
Urban digital twins have matured into powerful tools for infrastructure monitoring, traffic simulation, and environmental modeling. Yet a fundamental asymmetry persists: these systems are designed to represent cities, not the people who inhabit them. Human wayfinding, behavioral responses to urban interventions, and spatial decision-making under uncertainty remain outside the scope of current digital twin platforms.
This project closes that gap. We propose a human-centered urban digital twin: a research platform that couples an automated geospatial-to-3D-city pipeline with a controlled behavioral experimentation layer. A researcher specifies a district, the system generates a semantically rich simulation environment from open geodata, live IoT streams ground it in physical reality, and structured experiments study how people navigate, decide, and fail within that environment.
The result is a reproducible, reusable infrastructure that positions geoinformatics as a substrate for behavioral urban science — connecting geospatial modeling, environmental sensing, and human-centered experimentation in a single coherent system accessible to researchers across disciplines.
In a nutshell
"Take any city. Feed its open GIS data into our pipeline and get a simulation-ready 3D environment in return. Then run controlled behavioral experiments inside that environment — with live sensor data, AI crowd agents, and full trajectory logging. That is the platform."
System Layer Overview
Seven layers — two pillars: an automated city model generator, and a behavioral experimentation engine built on top of it.
Ecological validity starts here. The fidelity of the resulting simulation depends directly on the richness and accuracy of the urban datasets ingested at this stage.
- Building footprints & cadastral data
- Road networks, sidewalks & open spaces
- Terrain / elevation models
- Orthophotos & aerial imagery
- Street-level / drone imagery (optional)
- IoT sensor locations & deployment metadata
A fully automated pipeline transforms heterogeneous geodata into a semantically structured, simulation-ready environment, eliminating the manual 3D modeling bottleneck that currently limits reproducibility in urban simulation research.
- Automated data cleaning & coordinate alignment
- Semantic classification of urban objects
- Procedural 3D building & infrastructure generation
- Navigation mesh & open-space structuring
- Georeferencing preserved throughout
- Direct export to simulation engine
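One pipeline stage can be sketched in miniature: extruding 2D building footprints (already reprojected to a metric CRS) into flat-roofed 3D blocks while preserving georeferenced coordinates. The footprint geometry, height value, and function name below are illustrative assumptions, not the platform's actual API.

```python
def extrude_footprint(ring, height):
    """Turn a closed 2D ring [(x, y), ...] into the vertices and quad faces
    of a flat-roofed 3D block; coordinates stay in the source CRS."""
    n = len(ring)
    base = [(x, y, 0.0) for x, y in ring]       # ground-level vertices
    top = [(x, y, height) for x, y in ring]     # roof vertices
    vertices = base + top
    # One quad wall per footprint edge: base_i, base_j, top_j, top_i
    walls = [(i, (i + 1) % n, n + (i + 1) % n, n + i) for i in range(n)]
    roof = tuple(range(n, 2 * n))               # roof polygon indices
    return vertices, walls, roof

# Hypothetical cadastral record: a 10 m x 20 m footprint, 12 m tall
verts, walls, roof = extrude_footprint([(0, 0), (10, 0), (10, 20), (0, 20)], 12.0)
print(len(verts), len(walls))  # 8 4
```

The real pipeline additionally classifies objects semantically and builds navigation meshes; this sketch only illustrates how geometry can flow from cadastral polygons into a simulation engine without manual modeling.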
The living model. Not a static 3D render, but a georeferenced, semantically structured representation of an urban district that can be queried, updated, and manipulated in real time by both the IoT layer and the experiment engine.
- Georeferenced 3D district model
- Semantic urban object graph
- Navigable surface mesh
- Dynamic environmental state
- Points of interest & landmark registry
- Queryable spatial index
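The "queryable spatial index" item can be illustrated with a minimal uniform-grid index over georeferenced objects, supporting fast radius queries. Object identifiers, coordinates, and the cell size are placeholder assumptions; a production twin would likely use an R-tree or a spatial database instead.

```python
import math
from collections import defaultdict

class GridIndex:
    """Toy grid-bucket spatial index over point-like urban objects."""

    def __init__(self, cell=50.0):              # 50 m cells; metric CRS assumed
        self.cell = cell
        self.cells = defaultdict(list)

    def insert(self, obj_id, x, y):
        key = (int(x // self.cell), int(y // self.cell))
        self.cells[key].append((obj_id, x, y))

    def query_radius(self, x, y, r):
        """Return ids of objects within distance r of (x, y)."""
        c = self.cell
        hits = []
        for cx in range(int((x - r) // c), int((x + r) // c) + 1):
            for cy in range(int((y - r) // c), int((y + r) // c) + 1):
                for obj_id, ox, oy in self.cells[(cx, cy)]:
                    if math.hypot(ox - x, oy - y) <= r:
                        hits.append(obj_id)
        return hits

idx = GridIndex()
idx.insert("kiosk_1", 120.0, 40.0)      # hypothetical POI registry entries
idx.insert("landmark_a", 500.0, 500.0)
print(idx.query_radius(100.0, 50.0, 30.0))  # ['kiosk_1']
```

Both the IoT layer ("which sensors cover this plaza?") and the experiment engine ("which landmarks are visible from this decision point?") would issue queries of this shape against the living model.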
Grounds experiments in physical reality. By coupling the twin to live or historically replayed sensor streams, environmental conditions (crowd density, noise, air quality, lighting) become independent variables rather than fixed assumptions.
- Real-time sensor stream ingestion
- MQTT / REST API support
- Historical replay for reproducibility
- Environmental state updates (thermal, acoustic, atmospheric)
- Spatially anchored sensor metadata
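The historical-replay idea can be sketched as re-emitting archived sensor readings in timestamp order, so every experimental run sees an identical environmental state sequence. The record fields and sensor ids are illustrative; in the live configuration the same callback would be fed by an MQTT subscription rather than an archive.

```python
def replay(records, on_reading):
    """Deliver archived sensor readings in chronological order."""
    for rec in sorted(records, key=lambda r: r["t"]):
        on_reading(rec)     # e.g. update the twin's environmental state

# Hypothetical archive: noise (dB) and particulate (PM2.5) readings
archive = [
    {"t": 12.0, "sensor": "noise_03", "value": 61.2},
    {"t": 5.0,  "sensor": "pm25_01",  "value": 14.8},
    {"t": 9.5,  "sensor": "noise_03", "value": 58.0},
]

state = {}  # latest value per sensor, as the twin would hold it
replay(archive, lambda r: state.__setitem__(r["sensor"], r["value"]))
print(state)  # {'pm25_01': 14.8, 'noise_03': 61.2}
```

Because the replay is deterministic, two participants assigned the same condition can experience byte-identical environmental dynamics, which is what makes sensor-driven conditions usable as controlled independent variables.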
Implements experimental control. Researchers define tasks, manipulate conditions, and trigger events with the precision of a laboratory — inside a photorealistic urban environment. Between-subjects and within-subjects designs are both supported.
- Structured task & condition definition
- Between- / within-subjects design support
- Event triggers & timed interventions
- Information load manipulation
- Accessibility & safety scenario scripting
- Counterbalancing & randomization
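The counterbalancing item can be sketched as a cyclic Latin square: each participant row is a rotation of the condition list, so every condition appears exactly once in every ordinal position and order effects are spread evenly across the sample. The condition names are placeholders, not part of any actual configuration schema.

```python
def latin_square_orders(conditions):
    """One cyclic rotation per row; each condition occupies each position once."""
    n = len(conditions)
    return [[conditions[(row + col) % n] for col in range(n)] for row in range(n)]

# Hypothetical within-subjects factor: signage density
orders = latin_square_orders(["low_signage", "medium_signage", "high_signage"])
for participant, order in enumerate(orders):
    print(participant, order)
```

A full design would cycle participants through these rows (participant i gets row i mod n) and combine this with randomized trial-level factors; a balanced Latin square, which also controls first-order carryover effects, would need the standard Williams construction instead of a plain rotation.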
Where experimental control meets ecological validity. Participants navigate a georeferenced, sensor-informed city model — not an abstract maze — with AI agents populating the environment to simulate realistic urban dynamics.
- Desktop mode; optional immersive VR
- Free-roam & task-directed navigation
- Calibrated AI pedestrian agents
- Dynamic signage, alerts & information overlays
- Real-time scenario event injection
- Participant–environment interaction logging
Converts embodied experience into analyzable data. Every spatial decision — including the ones participants did not consciously make — is recorded at high temporal resolution, enabling both hypothesis-driven and exploratory analysis.
- High-resolution movement trajectories
- Decision-point choice logging
- Hesitation dwell time & backtracking metrics
- Task completion times & error rates
- Interaction event logs
- Human vs. AI agent behavioral comparison
- Per-session & aggregate analytics export
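One of these metrics can be made concrete: hesitation dwell time, computed here as the total time a participant's sampled position remains within a small radius of a decision point. The sampling rate, radius, and trajectory values are illustrative assumptions, not the platform's actual logging format.

```python
import math

def dwell_time(trajectory, point, radius=2.0):
    """trajectory: [(t, x, y), ...] sorted by t.
    Returns seconds spent within `radius` metres of `point`."""
    total = 0.0
    for (t0, x0, y0), (t1, _, _) in zip(trajectory, trajectory[1:]):
        if math.hypot(x0 - point[0], y0 - point[1]) <= radius:
            total += t1 - t0    # count the interval that starts inside
    return total

# Hypothetical 1 Hz trajectory passing a decision point at (1, 0)
track = [(0.0, 0.0, 0.0), (1.0, 1.0, 0.0), (2.0, 5.0, 0.0), (3.0, 8.0, 0.0)]
print(dwell_time(track, (1.0, 0.0)))  # 2.0
```

Backtracking metrics follow the same pattern, flagging intervals where displacement along the task route becomes negative; both are computed per session from the same trajectory log.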
Designed for researchers, not engineers. Non-specialist users can build experiments, configure conditions, and export structured datasets without touching the underlying simulation engine or geodata pipeline.
- Experiment & scenario builder (no-code)
- Condition & counterbalancing setup
- Participant & session management
- Live IoT sensor monitoring
- Real-time experiment progress view
- Statistical summary & trajectory visualization
- Structured dataset export (CSV, GeoJSON, HDF5)
Research Scenarios
Each scenario targets a specific research question with manipulable independent variables and measurable behavioral outcomes.
Wayfinding & Spatial Cognition
Do intersection complexity and signage density independently predict route deviation in unfamiliar urban environments?
Systematically vary layout legibility, landmark availability, and signage load while measuring trajectory efficiency, error rate, and decision-point hesitation.
Emergency & Safety Behavior
What environmental cues trigger evacuative routing, and does prior spatial familiarity reduce response latency?
Inject hazard events and route closures mid-experiment to measure behavioral responses, including crowd-following, backtracking, and freezing, under controlled urgency conditions.
Smart City Interface Evaluation
Do augmented information displays reduce decision time and error rate at complex urban decision nodes?
Deploy and remove dynamic signage, information kiosks, and AR overlays as experimental conditions, measuring their effect on spatial decision quality and cognitive load proxies.
Three Original Claims
Each contribution is stated as a falsifiable claim, not a feature — grounding the proposal in scientific rather than engineering terms.
Geospatial Digital Twin Methodology
We demonstrate that open geodata alone is sufficient to generate ecologically valid, semantically structured simulation environments, removing a longstanding barrier to reproducible urban simulation research.
- First automated pipeline: open GIS data → experiment-ready 3D city
- Geospatial fidelity preserved end-to-end (CRS, semantics, topology)
- Reproducible model generation across any city with open cadastral data
Human-Centered Urban Science
We provide the first platform that unifies controlled behavioral experimentation with a physically grounded urban environment, enabling causal inference about human–space interaction at the scale of a city district.
- Experimental designs (within/between-subjects) in photorealistic urban space
- High-resolution behavioral traces beyond survey or GPS data
- Direct comparison of human and synthetic agent behavior
Shared Research Infrastructure
A single platform that serves geoinformatics, HCI, cognitive science, and urban planning simultaneously — reducing the duplicated effort of discipline-specific tooling and enabling genuinely interdisciplinary experiments.
- No-code experiment design accessible to non-technical researchers
- Open data formats and standardized behavioral protocols
- Extensible architecture for custom sensor types, tasks, and metrics
Deliverables
- Working platform: geospatial pipeline + behavioral simulation, tested on at least one real urban district
- Open-source automated pipeline from municipal geodata to simulation-ready city model
- Validated behavioral experiment framework with documented protocols and inter-rater reliability
- Publicly archived datasets from at least two completed experiment series
- Peer-reviewed publications targeting venues in GIScience, CHI, and Urban Informatics
Applications
Urban research is increasingly demanded by practitioners — planners, public health officials, emergency managers — who need behavioral evidence, not just sensor readings. This platform produces that evidence at a cost and speed no physical field study can match.
- Urban planning & infrastructure design
- Smart city evaluation
- Accessibility analysis
- Safety & emergency planning
- Human-centered urban design
Research Framing