
Simulation of Future Street Scenarios in Virtual Reality


Click here to check out the academic publication associated with this project

City streets today are designed around peak-hour traffic, which means they stay over-allocated to cars while pedestrians and alternative uses get sidelined. Imagine if our streets could adapt in real time: expanding walkways, shrinking roadways, or adjusting parking as needed. Sounds futuristic? Well, it's already happening! Various frameworks have been proposed, and some prototypes have even been built.

To explore public perception and acceptance of such designs, we developed an immersive VR simulation that allows users to experience and interact with different street scenarios in a controlled, safe environment:

VR Simulation Details

You can find all the details about our experiment with 43 participants and the corresponding results in our publication. Here, I want to focus on how a colleague and I built the VR simulation in the Unity game engine.

Car & Pedestrian Agents

Image of a red car

A key aspect of the simulation was creating a dynamic urban environment in which pedestrians and cars behave realistically. Both rely on a simple waypoint system with pre-defined paths; pedestrians additionally use Unity's NavMesh API for pathfinding. Cars did not require pathfinding, since we modeled neither overtaking nor complex road networks. Instead, I implemented a simple steering mechanism with a few tweakable parameters (sketched below). I conducted initial tests in this cute low-poly environment:

A very basic simulation with cars following single pre-defined paths and a steady stream of pedestrians walking on the sidewalk
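To give an idea of what this steering mechanism can look like, here is a minimal sketch of a waypoint follower. The class, field names, and parameter values are illustrative, not the exact ones from our project:

```csharp
using UnityEngine;

// Minimal sketch of a car following a pre-defined waypoint path.
// All parameters are illustrative placeholders.
public class CarWaypointFollower : MonoBehaviour
{
    public Transform[] waypoints;        // pre-defined path along the lane
    public float maxSpeed = 10f;         // m/s
    public float turnSpeed = 2f;         // how quickly the car rotates toward the next waypoint
    public float arriveThreshold = 1.5f;

    private int current;

    void Update()
    {
        if (waypoints.Length == 0) return;

        Vector3 toTarget = waypoints[current].position - transform.position;
        toTarget.y = 0f; // steer only in the ground plane

        // Advance to the next waypoint once close enough (looping path).
        if (toTarget.magnitude < arriveThreshold)
        {
            current = (current + 1) % waypoints.Length;
            return;
        }

        // Rotate smoothly toward the waypoint, then drive forward.
        Quaternion desired = Quaternion.LookRotation(toTarget);
        transform.rotation = Quaternion.Slerp(transform.rotation, desired, turnSpeed * Time.deltaTime);
        transform.position += transform.forward * maxSpeed * Time.deltaTime;
    }
}
```

Pedestrians skip this kind of manual steering entirely: the waypoint system simply hands the next waypoint to their NavMeshAgent, which takes care of pathfinding.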

To make vehicle behavior more dynamic, I introduced variability in how each car accelerated, braked, and reached its top speed. Pedestrian movement was similarly diversified, not only through speed variations but also by allowing small deviations from their predefined paths. Unlike pedestrians, however, cars had to remain within their designated lanes to maintain realistic road behavior.
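As a sketch of how such variability can be sampled at spawn time (the ranges below are placeholders, not the values we actually tuned):

```csharp
using UnityEngine;
using UnityEngine.AI;

// Hypothetical sketch: each pedestrian samples its own parameters once at spawn,
// so the crowd does not move in lockstep. Cars get the same treatment for
// acceleration, braking, and top speed.
public class PedestrianVariability : MonoBehaviour
{
    public float pathJitter = 0.75f; // max sideways deviation from a waypoint, in meters

    private NavMeshAgent agent;

    void Start()
    {
        agent = GetComponent<NavMeshAgent>();
        agent.speed = Random.Range(1.0f, 1.7f); // walking speed differs per person (m/s)
    }

    // Called by the waypoint system instead of SetDestination directly,
    // so pedestrians do not all walk the exact same line.
    public void GoTo(Vector3 waypoint)
    {
        Vector2 offset = Random.insideUnitCircle * pathJitter;
        agent.SetDestination(waypoint + new Vector3(offset.x, 0f, offset.y));
    }
}
```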

With cars moving at different speeds and pedestrians (including the experiment participant) crossing streets, collision avoidance became crucial. Handling pedestrian collisions was straightforward, as the NavMesh API includes built-in collision avoidance. For vehicles, I implemented a sensor system using raycasts directed forward and slightly to the sides: if a raycast detects another car traveling in the same direction or a pedestrian in its path, the vehicle applies a braking force (a sketch of this sensor follows below). After a lot of tweaking and customizing of this sensor setup, the cars behaved more or less believably, although far more realistic (and complex!) traffic models exist, e.g. SUMO, that can handle large traffic networks and different modes of transportation. Lastly, both cars and pedestrians had to be programmed to follow the rules of each street design, whether a conventional traffic-light system (status quo) or a dynamically adaptable layout (inLED).
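Here is that minimal sensor sketch; the angles, range, tag, and braking response are assumptions for illustration:

```csharp
using UnityEngine;

// Sketch of the forward-facing sensor idea. The steering script scales its
// braking force by the value returned from Sense().
public class CarSensor : MonoBehaviour
{
    public float sensorRange = 15f;
    public float[] sensorAngles = { -15f, 0f, 15f }; // forward and slightly to the sides
    public LayerMask obstacleMask;                   // layers containing cars and pedestrians

    // Returns a braking factor in [0, 1]: 0 = free road, 1 = obstacle very close.
    public float Sense()
    {
        float brake = 0f;
        Vector3 origin = transform.position + Vector3.up * 0.5f; // cast from bumper height

        foreach (float angle in sensorAngles)
        {
            Vector3 dir = Quaternion.AngleAxis(angle, Vector3.up) * transform.forward;
            if (Physics.Raycast(origin, dir, out RaycastHit hit, sensorRange, obstacleMask))
            {
                // Brake for cars heading the same way and for pedestrians;
                // oncoming traffic in the opposite lane is ignored.
                bool sameDirection = Vector3.Dot(hit.transform.forward, transform.forward) > 0f;
                if (sameDirection || hit.collider.CompareTag("Pedestrian"))
                    brake = Mathf.Max(brake, 1f - hit.distance / sensorRange);
            }
        }
        return brake;
    }
}
```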

Virtual Environment

Image of a regular street crossing

Another large work package was modeling the environment with the different street designs (status quo, inLED, curbless) at multiple times of day (morning, noon, afternoon, evening, night; see the showcase video above), resulting in a total of 15 configurations. I built the environment using a module-based city asset from the Unity Asset Store, together with various other assets for cars and props. For pedestrians, we used rigged 3D models from Mixamo.
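The 15 configurations are simply the cross product of three street designs and five times of day. One hypothetical way to enumerate and apply them (the fields below are placeholders for whatever a scene actually swaps out, such as lighting, skybox, geometry, or signal logic):

```csharp
using UnityEngine;

// Hypothetical sketch of enumerating the 3 x 5 scenario space.
public enum StreetDesign { StatusQuo, InLED, Curbless }
public enum TimeOfDay { Morning, Noon, Afternoon, Evening, Night }

public class ScenarioLoader : MonoBehaviour
{
    public GameObject[] designVariants; // one street-geometry root per StreetDesign
    public Material[] skyboxes;         // one skybox per TimeOfDay
    public Light sun;

    public void Load(StreetDesign design, TimeOfDay time)
    {
        // Enable exactly one street layout.
        for (int i = 0; i < designVariants.Length; i++)
            designVariants[i].SetActive(i == (int)design);

        // Swap the lighting to match the time of day.
        RenderSettings.skybox = skyboxes[(int)time];
        sun.intensity = (time == TimeOfDay.Night) ? 0.05f : 1f; // placeholder values
        DynamicGI.UpdateEnvironment(); // refresh ambient lighting after the skybox change
    }
}
```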

Bringing it to VR

Fortunately, Unity offers solid built-in VR support, making it straightforward to turn a regular 3D project into a VR project. We used a Meta Quest 2 with Air Link, so a powerful desktop PC could handle the simulation and rendering while streaming to the headset wirelessly. As our experiment was the first contact with a VR headset for many participants, we additionally built a simple tutorial environment where we taught them the basics of navigation. While we had a relatively large physical space for the experiment, it wasn't sufficient for a full 1:1 mapping of the city environment, so we also supported movement via the headset's controllers while encouraging natural walking wherever possible.
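Unity's XR Interaction Toolkit ships ready-made move providers for exactly this, but a hand-rolled sketch shows the idea; names like `rig` and `head` are assumptions:

```csharp
using UnityEngine;
using UnityEngine.XR;

// Sketch of stick-based locomotion layered on top of natural walking:
// the left joystick moves the whole rig in the direction the participant looks.
public class StickLocomotion : MonoBehaviour
{
    public Transform rig;      // root of the XR rig (moved by the joystick)
    public Transform head;     // main camera / HMD (defines the walking direction)
    public float speed = 1.4f; // m/s, roughly walking pace

    void Update()
    {
        var device = InputDevices.GetDeviceAtXRNode(XRNode.LeftHand);
        if (!device.TryGetFeatureValue(CommonUsages.primary2DAxis, out Vector2 axis))
            return;

        // Move in the gaze direction projected onto the ground plane,
        // so looking up or down does not make the rig fly or sink.
        Vector3 forward = Vector3.ProjectOnPlane(head.forward, Vector3.up).normalized;
        Vector3 right = Vector3.ProjectOnPlane(head.right, Vector3.up).normalized;
        rig.position += (forward * axis.y + right * axis.x) * speed * Time.deltaTime;
    }
}
```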