2024 · Product Design · UX Research · Web App


Road-observation SaaS for monitoring driving safety


Overview

Visor is a B2B web application built for a regional road maintenance company responsible for monitoring road conditions across an entire region. As the sole designer on a team of four, I led the end-to-end design – from initial concept to a successfully shipped product – while also taking on product strategy, team coordination, and the client pitch.

Timeline: 11 months
Team: 4 people (Designer / PM, Backend Developer, Frontend Developer, DevOps)
My role: UX/UI Design, Product Strategy, User Research, Client Pitch


Problem

The company’s employees are responsible for the safety and condition of road surfaces across hundreds of kilometers. To do their job, they rely on a network of over 100 surveillance cameras placed along highways and regional roads.

Their existing tool was a mobile app with a single flat list of camera names – no search, no filtering, no sorting, no visual context. Finding the right camera meant scrolling through an endless, unstructured list every single time.

In a job where response time matters – spotting icy patches, accidents, or weather damage quickly – every wasted minute searching for a camera is a minute the road goes unmonitored.

The core question: How do we turn a frustrating, time-consuming lookup process into an intuitive, spatially-aware monitoring system?


Research

The challenge of a closed market

Service-grade road monitoring systems are closed to the public for security reasons. There were no competitor products we could simply sign up for and analyze. This meant I had to get creative with research.

I studied publicly available camera viewer applications – traffic cams, city surveillance platforms, weather monitoring tools – and extracted interaction patterns that could translate to our context. I was looking for answers to specific questions: How do you organize 100+ cameras on a single screen? How do you handle clustering without losing context? What metadata matters most at a glance?

Talking to real users

We had direct access to the company’s employees – the people who would actually use this tool daily. Through interviews and observation, several key insights emerged:

Location is everything. Employees think in terms of roads and kilometer markers, not camera IDs. When something happens on highway M-01 at km 130, they need to find that camera instantly – not remember that it’s called “CAM-07-HWY-OBS.”

Weather is part of the job. Road monitoring isn’t just about watching traffic. Employees need to know if it rained, snowed, or if the road surface temperature dropped below zero. Precipitation and temperature data aren’t nice-to-haves – they’re core to the monitoring workflow.

Speed is non-negotiable. The old app was so slow and frustrating that employees had developed workarounds – bookmarking cameras, keeping handwritten lists. Any new solution had to be dramatically faster, not just marginally better.


Solution

Interactive map as the primary interface

The central design decision was replacing the flat list with an interactive map. Every camera is placed at its real geographic location, giving employees an immediate spatial understanding of coverage across the region.

Cameras are clustered by proximity and expand as users zoom in, preventing visual overload while keeping every camera accessible. Each cluster displays aggregated weather data – air temperature and road surface temperature – so employees can scan conditions across the region at a glance.
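The clustering step can be sketched in a few lines. A grid-based approach is one common way to do this on web maps; the interfaces, names, and cell-size parameter below are illustrative assumptions, not the shipped implementation:

```typescript
interface Camera {
  id: string;
  lat: number;
  lon: number;
  airTempC: number;     // air temperature, °C
  surfaceTempC: number; // road surface temperature, °C
}

interface Cluster {
  cameras: Camera[];
  avgAirTempC: number;
  avgSurfaceTempC: number;
}

// Bucket cameras into grid cells sized for the current zoom level and
// aggregate weather data per cell, so each cluster marker can show
// average air and road-surface temperature at a glance.
function clusterCameras(cameras: Camera[], cellSizeDeg: number): Cluster[] {
  const cells = new Map<string, Camera[]>();
  for (const cam of cameras) {
    const key = `${Math.floor(cam.lat / cellSizeDeg)}:${Math.floor(cam.lon / cellSizeDeg)}`;
    const bucket = cells.get(key) ?? [];
    bucket.push(cam);
    cells.set(key, bucket);
  }
  return [...cells.values()].map((group) => ({
    cameras: group,
    avgAirTempC: group.reduce((sum, c) => sum + c.airTempC, 0) / group.length,
    avgSurfaceTempC: group.reduce((sum, c) => sum + c.surfaceTempC, 0) / group.length,
  }));
}
```

Zooming in would simply re-run the function with a smaller `cellSizeDeg`, so clusters split apart while every camera stays reachable.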

Structured navigation

Not every task starts with a map. Sometimes you know the road – you just need to find the right segment. The sidebar provides an organized, searchable list of all cameras grouped by road designation (M-01, M-02, P-56, etc.) and broken down by kilometer markers and location names.

This dual navigation – map for spatial awareness, list for targeted lookup – means there’s always a fast path to the right camera, regardless of how the employee approaches the task.
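As a sketch, the sidebar's grouping logic amounts to bucketing cameras by road designation and ordering each bucket by kilometer marker. The `CameraEntry` shape, function name, and sample locations below are hypothetical, for illustration only:

```typescript
interface CameraEntry {
  id: string;
  road: string;     // road designation, e.g. "M-01"
  km: number;       // kilometer marker along the road
  location: string; // human-readable location name
}

// Group cameras by road designation and sort each group by kilometer
// marker, mirroring the sidebar's road → km → location hierarchy.
function groupByRoad(cameras: CameraEntry[]): Map<string, CameraEntry[]> {
  const groups = new Map<string, CameraEntry[]>();
  for (const cam of cameras) {
    const list = groups.get(cam.road) ?? [];
    list.push(cam);
    groups.set(cam.road, list);
  }
  for (const list of groups.values()) list.sort((a, b) => a.km - b.km);
  return groups;
}
```

Because each group is already ordered by kilometer marker, a search for "M-01 km 130" reduces to one lookup plus a scan of a short, sorted list.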

Full-screen video with context

Selecting a camera opens a full-screen video feed with everything the employee needs layered around it: live/recorded toggle, playback controls with variable speed, and quick navigation to adjacent cameras on the same road (Previous / Next).
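If each road's cameras are kept sorted by kilometer marker, Previous / Next becomes a neighbour lookup in that ordered list. A minimal sketch, with assumed names and types:

```typescript
interface RoadCamera {
  id: string;
  km: number; // kilometer marker along the road
}

// Jump to the neighbouring camera on the same road. `sortedByKm` must be
// ordered by kilometer marker; returns undefined at either end of the road.
function adjacentCamera(
  sortedByKm: RoadCamera[],
  currentId: string,
  direction: "prev" | "next"
): RoadCamera | undefined {
  const i = sortedByKm.findIndex((c) => c.id === currentId);
  if (i === -1) return undefined;
  return sortedByKm[direction === "next" ? i + 1 : i - 1];
}
```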

The timeline at the bottom serves a dual purpose – it’s both a video scrubber and a weather history display, showing temperature and conditions over time. This allows employees to quickly scroll back to the moment it started snowing or when temperatures dropped, without switching between separate tools.
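Under the hood, pairing the scrubber with weather history only requires mapping the scrubbed timestamp onto the nearest recorded weather sample. A simplified sketch (the sample shape and function name are assumptions):

```typescript
interface WeatherSample {
  timestamp: number;    // unix seconds
  airTempC: number;
  surfaceTempC: number;
}

// Map a scrubber position to the closest recorded weather sample, so the
// timeline can show the conditions at the moment being viewed.
function sampleAt(samples: WeatherSample[], timestamp: number): WeatherSample {
  if (samples.length === 0) throw new Error("no weather history");
  return samples.reduce((best, s) =>
    Math.abs(s.timestamp - timestamp) < Math.abs(best.timestamp - timestamp) ? s : best
  );
}
```

For long histories a binary search over the (time-ordered) samples would replace the linear scan, but the idea is the same: one timeline position, one weather reading.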

Key details visible at all times: connection latency, air temperature, road surface temperature, road designation, kilometer marker, and location name.


Results

Measured against the legacy mobile tool

Time on task


Employees found cameras in roughly half the time thanks to spatial navigation and structured search.

Task success rate


More tasks completed correctly on the first attempt due to clearer hierarchy and better information grouping.

User error rate


Fewer wrong cameras opened and misidentified locations because spatial context made it harder to make mistakes.

Projected revenue impact


Less time on lookup meant more time on actual monitoring – improving operational efficiency and projected revenue.

Wondering where the time savings come from? Try the demo and feel the difference yourself.

Interactive

Find the camera

Each monitoring point has multiple cameras. Find one specific camera – first in a flat list of 101+ entries, then on a map.

Target camera
M-01 · km 130 · Ivanivka Center

Reflection

This was the project where my understanding of design shifted from theoretical to practical.

Working as the sole designer while simultaneously handling product strategy and team coordination taught me that good UX doesn’t exist in a vacuum – it lives at the intersection of user needs, technical constraints, and business viability. Every design decision had to survive not just usability testing, but budget discussions and development timelines.

If I were starting this project today, I’d explore integrating AI-powered detection – automatic identification of weather events, road surface conditions, and traffic incidents directly from the camera feeds. The monitoring data is already there; the next evolution is making the system proactive rather than reactive.


Shipped in 2024.