Automated driving is increasingly finding its way into our cities. The technology promises more safety, efficiency, and comfort in road traffic, both for drivers and passengers. Research into driverless autonomous vehicles still faces significant challenges, particularly in complex urban environments, both from a technical point of view and in terms of user acceptance. At Ergosign, we design and develop products and services that support and delight people in their work, human-centered at all times. In the STADT:up project, we combine our research expertise in autonomous mobility with our human-centered design approach and can thus make valuable contributions to the future of mobility. In this groundbreaking project, we are designing human-centered interaction concepts for the occupants of automated vehicles in everyday traffic situations. Our goal is to address user concerns early, thereby creating a positive and trustworthy user experience.
STADT:up - a visionary research project for autonomous mobility
What is STADT:up?
STADT:up is a comprehensive, multi-year collaborative project with 22 partners from industry and research. The project started in January 2023, is funded by the German Federal Ministry for Economic Affairs and Climate Action (BMWK, Bundesministerium für Wirtschaft und Klimaschutz), and will run until December 2025. STADT:up aims to design consistent, scalable solutions for future urban mobility in which vehicles navigate safely through complex, inner-city traffic situations. The project develops user-centered concepts and pilot applications for automated driving in urban areas. The focus is on interactions with vulnerable road users (pedestrians and cyclists) and on complex situations, especially in conjunction with AI-based methods. In this workshop report, we would like to provide initial insights into our work in STADT:up.
That motivated us, and me personally. But is this the significant change we have collectively envisioned over the past decade, one that should bring about a better world for our children and for us? Is the big vision still achievable, and how do we get there?
Human factors and automated driving
STADT:up is structured into five sub-projects. Ergosign is particularly involved in two of them: "Human Factors" and "Automated Driving". In the first sub-project, we are concerned with the design and prototypical development of interaction concepts that explain automation behavior. The aim is to strengthen vehicle occupants' trust and sense of security in complex inner-city traffic situations. The "Automated Driving" sub-project focuses on demonstrating these concepts in simulator and real-vehicle studies, which we implement in close collaboration with our research partners. Here we are developing an engineering UI that processes sensor and vehicle information in real time and thus provides system engineers with valuable information and functions during the development process.
Human factors — milestones:
1. Literature, context and requirements analysis
With a well-founded analysis of the literature and related work, we lay a solid foundation for the human-centered development of interaction concepts. In addition to a sense of security and joy of use, trust in automated vehicles (AVs) is one of the central challenges for user acceptance. The importance of in-vehicle HMIs should not be underestimated here: in autonomous vehicles, they are irreplaceable as the interface between humans and technology, compensating for the absence of a human driver. To address trust and security concerns, especially in critical situations, adequate system feedback is required, communicated via the HMI. For example, the vehicle's sensor-based perception can be visualized in real time, thus providing explanations of the system behavior.
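The idea of explanatory system feedback can be illustrated with a minimal, purely hypothetical sketch: mapping the vehicle's object detections to short natural-language explanations shown on the HMI. All classes, field names, and wording below are invented for illustration and are not part of the project's actual concepts.

```python
# Illustrative sketch only: turn (hypothetical) perception output into
# short HMI explanations of the automation behavior.
from dataclasses import dataclass

@dataclass
class Detection:
    kind: str          # e.g. "pedestrian", "cyclist", "vehicle"
    distance_m: float  # distance to the detected object
    in_path: bool      # whether the object intersects the planned trajectory

def explain(detections: list[Detection]) -> str:
    """Pick the most relevant detection and phrase a short explanation."""
    relevant = [d for d in detections if d.in_path]
    if not relevant:
        return "Route clear - maintaining speed."
    closest = min(relevant, key=lambda d: d.distance_m)
    return f"Slowing down: {closest.kind} detected {closest.distance_m:.0f} m ahead."

print(explain([Detection("pedestrian", 12.0, True),
               Detection("vehicle", 30.0, False)]))
```

The point of such a mapping is that the HMI does not merely show raw sensor output but links the vehicle's behavior (slowing down) to its cause (a detected pedestrian), which is exactly the kind of explanation that can build trust.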
In addition to the classic literature review, we always keep an eye on the state of technology and the industry and continuously analyze existing and new HMI and visualization concepts. The focus is primarily on the conceptual and visual aspects of the interfaces under consideration. Among others, we examine concepts from leading AV and AI technology companies, such as Waymo, Cruise, Baidu, or Mobileye, looking at the representation of the environment, routes, or trajectories and the use of different perspectives. We also look at the interplay of system feedback with other vehicle and route information or additional information elements.
In the spirit of the human-centered design process, we not only talk about the (potential) users but also with them. As part of STADT:up, we combined semi-structured interviews with co-creation methods. With these "co-creation interviews", we captured the needs and requirements of future users for the interaction concept. We converted the results into personas and user journeys and derived targeted design recommendations from them.
2. Focus scenarios
When developing and evaluating future concepts, we initially focus on four everyday inner-city traffic situations. These scenarios were chosen taking into account the following aspects:
High complexity of the inner-city traffic situation,
Involvement of vulnerable road users (VRU),
Variety of possible situational behaviors,
Frequency of the situation in inner-city traffic.
Using the selected scenarios, the complex challenges of inner-city road traffic and the resulting requirements for (internal) information and communication concepts can be presented, analyzed and discussed.
3. Interaction design: design exploration and initial concepts
Together with the defined scenarios, we have already transferred the results of the detailed requirements analysis into initial interaction concepts. The concepts explain the automation behavior of the vehicle in an understandable and efficient way. Other sub-projects of STADT:up are investigating ways to recognize the intentions of other road users at an early stage; these results will be integrated into our interaction concepts in the future. In this way, modular interaction building blocks for in-vehicle communication are created. The modular structure enables relevant information to be distributed to the vehicle occupants depending on the specific context, user group, and active automation level.
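The modular idea can be sketched in a few lines: each HMI module declares the contexts and automation levels for which it is relevant, and the system filters the active modules for the current situation. The module names, contexts, and level sets below are illustrative assumptions, not the project's actual modules.

```python
# Hypothetical sketch of context-dependent HMI module selection.
# Module names, contexts, and SAE-style levels are invented examples.
MODULES = [
    {"name": "trajectory_view", "levels": {3, 4, 5}, "contexts": {"urban", "highway"}},
    {"name": "vru_warning",     "levels": {2, 3, 4, 5}, "contexts": {"urban"}},
    {"name": "takeover_prompt", "levels": {2, 3}, "contexts": {"urban", "highway"}},
]

def active_modules(context: str, level: int) -> list[str]:
    """Return the modules to display for the current context and automation level."""
    return [m["name"] for m in MODULES
            if context in m["contexts"] and level in m["levels"]]

print(active_modules("urban", 4))
```

A declarative structure like this keeps the distribution logic in one place, so new modules or user-group constraints can be added without touching the selection code.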
Based on the analysis results, first design ideas are already available as mock-ups and wireframes and will be developed iteratively. As a next step, we will build tangible prototypes that are explored and evaluated in simulator and Wizard-of-Oz studies.
Automated driving - milestones:
To test and demonstrate our concepts, we focus in particular on interactions with vulnerable road users. For efficient collaboration with our project partners, the focus is currently on developing an Engineering UI.
The Engineering UI will have the following tasks, among others:
Preparation and visualization of sensor data from partner systems.
Provision of suitable functions for data acquisition and system demonstration.
Configuration and demonstration of HMI modules for end users in the test vehicles.
What have we already achieved? So far, we can report the following about these ambitious goals:
1. Requirements analysis
As part of the project kick-off, Ergosign moderated a world café discussion that provided input for the requirements specification of the Engineering UI from the user's perspective. Building on this, detailed framework conditions and requirements were recorded in various joint workshops and meetings. We bundled the results into personas, journey maps, and problem-statement maps. These artifacts lay an essential foundation for the successful conception, development, and integration of the (engineering) UIs.
As part of the requirements analysis, we conducted a comprehensive technology and market analysis of available frameworks. Particularly interesting were the Robot Operating System (ROS), the debugging and visualization framework Foxglove, and further tools for visualizing sensor data, along with research into the required hardware components. We are coordinating closely with cooperation partners such as the Intelligent Vehicles Lab at Munich University of Applied Sciences, who are setting up test vehicles as part of STADT:up in which we will integrate our HMI components.
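To give a flavor of what "preparation and visualization of sensor data" involves, independent of the actual ROS/Foxglove pipeline used in the project, here is a small self-contained sketch of a common preprocessing step: converting a 2D range scan from polar sensor readings into Cartesian points that a UI can draw. The function name and parameters are our own illustration, though they mirror the fields of typical laser-scan messages.

```python
import math

def scan_to_points(ranges, angle_min, angle_increment):
    """Convert a 2D range scan (polar readings) into Cartesian (x, y) points,
    skipping invalid (infinite or NaN) readings - a typical preprocessing
    step before sensor data can be visualized."""
    points = []
    for i, r in enumerate(ranges):
        if math.isinf(r) or math.isnan(r):
            continue
        angle = angle_min + i * angle_increment
        points.append((r * math.cos(angle), r * math.sin(angle)))
    return points

# Three beams at -90 deg, 0 deg, and +90 deg, each reading 2 m:
pts = scan_to_points([2.0, 2.0, 2.0], -math.pi / 2, math.pi / 2)
for x, y in pts:
    print(f"({x:+.2f}, {y:+.2f})")
```

In practice, such conversions run continuously on live data streams, which is why real-time performance is a central requirement for the Engineering UI.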
2. Focus scenarios
Ergosign's scenarios were chosen with a focus on vulnerable road users (see above). For the development and integration of the HMIs/UIs, we will also consider the scenarios of the cooperation partners in whose test vehicles the HMI concepts will be integrated.
3. Interaction design: initial concepts and test set-up
Based on the (continuous) requirements analysis and coordination with the project partners, we carried out first ideation workshops. We visualized the resulting ideas and concepts as scribbles and initial wireframes, which will be used for coordination with the developers and project partners in upcoming concept workshops. In further steps, the concepts will be converted into initial prototypes, evaluated, optimized if necessary, and finally implemented. The current focus is on the Engineering UI: a lightweight and compact UI for vehicle engineers, developers, and designers. In addition, the Ergosign developer team has already dealt intensively with the framework conditions for integration into the partner test vehicles and implemented a first test set-up.
What is next?
The STADT:up project will run until December 2025 — even though we have already reached some important milestones, we still have some challenges ahead of us. We will regularly share the project's progress here. Stay tuned!
This work is a result of the joint research project STADT:up (Förderkennzeichen, i.e. funding code, 19A22006K). The project is supported by the German Federal Ministry for Economic Affairs and Climate Action (BMWK), based on a decision of the German Bundestag.
In recent times, artificial intelligence (AI) has made a significant impact across various sectors. We are confident that the integration of AI into both our own offerings and those of our clients will continue to progress. Leveraging our extensive experience and diverse UX portfolio, we look forward to shaping the future alongside these technological advancements.