EXPERT TIP
Design of “trust” with robots
Trust can be built and developed in different ways. The simplest example is to give a robot a human appearance. In the future, machines may emerge that communicate with people and at the same time make a trustworthy impression on their human counterpart. The projects of the "Human Centered Robotics Group," which has created a robot head reminiscent of a manga girl, are good examples of this. The design, based on a schema of childlike features (big eyes), makes an innocent impression; it draws on the same key stimuli that small children and young animals trigger through their proportions (large head, small body). The robot also creates trust because it recognizes who is speaking to it: it makes eye contact and thus conveys attentiveness. Not only how a robot should act but also what it should look like often depends on the cultural context. In Asia, robots are modeled more closely on human beings, while in Europe they are designed as mechanical objects. The first American robot was a big tin man; the first Japanese robot was a big, fat, laughing Buddha.
Description of the robona
Once robots become more similar to human beings, they can be used more flexibly: they help both in nursing care for the elderly and on construction sites. Trust arises when the robot behaves in the manner the human being expects and, in particular, when the human feels safe as a result of this behavior. Robots that do not hurt people in their work and that stop in emergency situations are trusted. Only then can they interact with people as part of a team. Both sides learn, establish trust, and are able to reduce disruptions in the process. The question of trust becomes more complicated when human and robot activities span different social systems or when activities are supported by cloud robotics. Then the interface is no longer a pair of big, trust-inducing eyes but autonomous helpers that direct and guide us and thus provide us with a basis for decisions.