Mercurial > hg > Members > atton > intelligence_robotics
annotate slide.md @ 10:62f384a20c2c (fix)
author: tatsuki
date: Fri, 26 Jun 2015 09:09:58 +0900
parents: 8a6f547b72c0 | children: 7104d522d2f0

title: A Novel Greeting Selection System for a Culture-Adaptive Humanoid Robot
author: Tatsuki KANAGAWA <br> Yasutaka HIGA
profile: Concurrency Reliance Lab
lang: Japanese

# Abstract: Robots and cultures
* Robots, especially humanoids, are expected to perform human-like actions and adapt to our ways of communication in order to facilitate their acceptance in human society.
* Among humans, rules of communication change depending on background culture.
* Greetings are a part of communication in which cultural differences are strong.

# Abstract: Summary of this paper
* In this paper, we present the modelling of social factors that influence greeting choice,
* and the resulting novel culture-dependent greeting gesture and words selection system.
* An experiment with German participants was run using the humanoid robot ARMAR-IIIb.

# Introduction: Acceptance of humanoid robots
* Acceptance of humanoid robots in human societies is a critical issue.
* One of the main factors is the relationship between the background culture of human partners and acceptance.
  * e.g., ecologies, social structures, philosophies, educational systems.

# Introduction: Culture-adapted greetings
* In the work of Trovato et al., culture-dependent acceptance of and discomfort with greeting gestures were found in a comparative study with Egyptian and Japanese participants.
* The importance of culture-specific customization of greetings was thus confirmed.
* Acceptance of robots can be improved if they are able to adapt to different kinds of greeting rules.

# Introduction: Methods of implementing adaptive behaviour
* Adaptive behaviour in robotics can be achieved through various methods:
  * reinforcement learning
  * neural networks
  * genetic algorithms
  * function regression

# Introduction: Greeting interaction with robots
* Robots are expected to interact and communicate with humans of different cultural backgrounds in a natural way.
* It is therefore important to study greeting interaction between robots and humans.
* ARMAR-III: greeted the Chancellor of Germany with a handshake.
* ASIMO: is capable of performing a wider range of greetings
  * (a handshake, waving both hands, and bowing).

# Introduction: Objectives of this paper
* The robot should be trained with sociology data related to one country, and evolve its behaviour by engaging with people of another country in a small number of interactions.
* For the implementation of the gestures and the interaction experiment, we used the humanoid robot ARMAR-IIIb.
* As the experiment is carried out in Germany, the interactions are with German participants, while preliminary training is done with Japanese data, which is culturally extremely different.

# Introduction: ARMAR-IIIb
<img src="pictures/ARMAR-IIIb.png" style='width: 350px; height: 350px; margin-left: 200px;'>

# Introduction: Target scenario
* The idea behind this study is a typical scenario in which a foreigner visiting a country for the first time greets local people in an inappropriate way as long as he is unaware of the rules that define the greeting choice.
  * (e.g., a Westerner in Japan)
* For example, he might want to shake hands or hug, and will receive a bow instead.

# Introduction: Objectives of this work
* This work is an application of a study of sociology into robotics.
* Our contribution is to synthesize the complex and sparse data related to greeting types into a model;
* create a selection and adaptation system;
* and implement the greetings in a way that can potentially be applied to any robot.

# Greeting Selection: Greetings among humans
* Greetings are the means of initiating and closing an interaction.
* We desire that robots be able to greet people in a similar way to humans.
* For this reason, understanding current research on greetings in sociological studies is necessary.
* Moreover, depending on cultural background, there can be different rules of engagement in human-human interaction.

# Greeting Selection: Solution for selection
* A unified model of greetings does not seem to exist in the literature, but a few studies have attempted a classification of greetings.
* Some more specific studies have been done on handshaking.

# Greeting Selection: Classes for greetings
* A classification of greetings was first attempted by Friedman based on intimacy and commonness.
* The following greeting types were mentioned: smile; wave; nod; kiss on mouth; kiss on cheek; hug; handshake; pat on back; rising; bow; salute; and kiss on hand.
* Greenbaum et al. also performed a gender-related investigation, while [24] contained a comparative study between Germans and Japanese.

# Greeting Selection: Factors on classification
* 'terms': same terms with different meanings, or different terms with the same meaning.
* 'location': influences intimacy and greeting words (private or public).
* 'intimacy': influenced by physical distance, eye contact, gender, location, and culture (social distance).
* 'time': time of the day is important for the choice of words.
* 'politeness', 'power relationship', 'culture', and more.

# Greeting Selection: Factors on classification
* The factors to be cut are greyed out.

<img src="pictures/factors.png" style='width: 60%; margin-left: 150px; margin-top: -50px;'>

# Model of Greetings: Assumptions (1 - 5)
* The simplification was guided by the following ten assumptions.
* Only two individuals (a robot and a human participant): we do not take into consideration a higher number of individuals.
* Eye contact is taken for granted.
* Age is considered part of 'power relationship'.
* Regionality is not considered.
* Setting is not considered.

# Model of Greetings: Assumptions (6 - 10)
* Physical distance is close enough to allow interaction.
* Gender is intended as a same-sex dyad.
* Affect is considered together with 'social distance'.
* Time since the last interaction is partially included in 'social distance'.
* Intimacy and politeness are not considered necessary as separate factors.

# Model of Greetings: Basis of classification
* Input
  * All the other factors are then considered features of a mapping problem.
  * They are categorical data, as they can assume only two or three values.
* Output
  * The outputs can also assume only a limited set of categorical values.

# Model of Greetings: Features, mapping discriminants, classes, and possible status
<img src="pictures/classes.png" style='width: 60%; margin-left: 150px;'>

# Model of Greetings: Overview of the greeting model
* The greeting model takes context data as input and produces the appropriate robot posture and speech for that input.
* The two outputs are evaluated by the participants of the experiment through written questionnaires.
* The training data that we obtain from this experience are given as feedback to the two mappings.

# Model of Greetings: Overview of the greeting model
<img src="pictures/model_overview.png" style='width: 75%; margin-left: 120px;'>

# Greeting selection system training data
* Mappings can be trained to an initial state with data taken from the literature of sociology studies.
* Training data should be classified through some machine learning method or formula.
* We decided to use conditional probabilities: in particular, the Naive Bayes formula to map data.
* Naive Bayes only requires a small amount of training data.

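The Naive Bayes mapping over categorical context features can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the feature names, values, and training rows are made up, and Laplace smoothing is an assumption.

```python
from collections import Counter, defaultdict

# Hypothetical training rows: context feature vector -> greeting type.
# Feature names and values are illustrative, not the paper's dataset.
TRAIN = [
    ({"location": "workplace", "social_distance": "acquaintance", "time": "day"}, "bow"),
    ({"location": "workplace", "social_distance": "acquaintance", "time": "day"}, "bow"),
    ({"location": "public", "social_distance": "unknown", "time": "day"}, "wave"),
    ({"location": "public", "social_distance": "friend", "time": "evening"}, "handshake"),
]

def train(samples):
    # Count class priors and per-class value frequencies for each feature
    class_counts = Counter(label for _, label in samples)
    value_counts = defaultdict(Counter)  # (feature, label) -> Counter of values
    for feats, label in samples:
        for f, v in feats.items():
            value_counts[(f, label)][v] += 1
    return class_counts, value_counts

def predict(class_counts, value_counts, feats, alpha=1.0):
    # Naive Bayes: argmax over P(class) * prod P(value | class)
    total = sum(class_counts.values())
    best, best_p = None, -1.0
    for label, c in class_counts.items():
        p = c / total
        for f, v in feats.items():
            counts = value_counts[(f, label)]
            # Laplace smoothing so unseen values never zero the product
            p *= (counts[v] + alpha) / (c + alpha * (len(counts) + 1))
        if p > best_p:
            best, best_p = label, p
    return best

model = train(TRAIN)
print(predict(*model, {"location": "workplace", "social_distance": "acquaintance", "time": "day"}))  # -> bow
```

With so few rows, the priors dominate quickly, which matches the slide's point that Naive Bayes works from a small amount of training data.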
# Model of Greetings: Details of training data
* While training data for gestures can be obtained from the literature, data for words can also be obtained from text corpora.
* English: English corpora, such as the British National Corpus or the Corpus of Historical American English, are used.
* Japanese: extracted from data sets by [24, 37, 41-43]. Analyzing Japanese corpora is difficult.

# Model of Greetings: Location Assumption
* The location of the experiment was Germany.
* For this reason, the only dataset needed was the Japanese one.
* As stated in the motivations at the beginning of this paper, the robot should initially behave like a foreigner.
* ARMAR-IIIb, trained with Japanese data, will have to interact with German people and adapt to their customs.

# Model of Greetings: Mappings and questionnaires
* The mapping is represented by a dataset, initially built from training data, as a table containing weights for each context vector corresponding to each greeting type.
* We now need to update these weights.

# Model of Greetings: Feedback from three questionnaires
* Whenever a new feature vector is given as an input, it is checked to see whether it is already contained in the dataset or not.
* In the former case, the weights are directly read from the dataset;
* in the latter case, they are assigned the values of the probabilities calculated through the Naive Bayes classifier.
* The output is the chosen greeting, after which the interaction is evaluated through a questionnaire.

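The lookup-or-classify step in the bullets above can be sketched like this; `naive_bayes_weights` is a hypothetical stand-in for the trained classifier, and the context keys and greeting names are illustrative assumptions.

```python
def greeting_weights(dataset, context, naive_bayes_weights):
    """Return per-greeting weights for a context vector.

    A known context is read directly from the dataset table; an unseen
    one falls back to Naive Bayes probabilities, which are cached so
    that later questionnaire feedback can update them."""
    key = tuple(sorted(context.items()))
    if key not in dataset:
        dataset[key] = naive_bayes_weights(context)
    return dataset[key]

def choose_greeting(weights):
    # The output is the greeting type with the highest weight
    return max(weights, key=weights.get)

dataset = {}
stub = lambda ctx: {"bow": 0.7, "handshake": 0.2, "hug": 0.1}  # stub classifier
w = greeting_weights(dataset, {"location": "workplace"}, stub)
print(choose_greeting(w))  # -> bow (under the stub probabilities)
```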
# Model of Greetings: Three questionnaires for feedback
* Answers to the questionnaires use a five-point semantic differential scale:
1. How appropriate was the greeting chosen by the robot for the current context?
2. (If the evaluation at point 1 was <= 3) which greeting type would have been appropriate instead?
3. (If the evaluation at point 1 was <= 3) which context would have been appropriate, if any, for the greeting type of point 1?

# Model of Greetings: Feedback and termination condition
* Weights of the affected features are multiplied by a positive or negative reward (inspired by reinforcement learning), which is calculated proportionally to the evaluation.
* Mappings stop evolving when the following two stopping conditions are satisfied:
  * all possible values of all features have been explored;
  * and the moving average of the latest 10 state transitions has decreased below a certain threshold.

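A minimal sketch of the multiplicative reward update and the moving-average stopping check, assuming a 5-point evaluation with 3 as neutral; the scaling factor and the threshold are assumptions, not the paper's values.

```python
def update_weights(weights, greeting, evaluation, scale=0.1):
    """Multiply the chosen greeting's weight by a reward proportional
    to the 5-point evaluation (3 is neutral), then renormalize."""
    reward = 1.0 + scale * (evaluation - 3)  # > 1 for good, < 1 for bad
    weights = dict(weights)
    weights[greeting] *= reward
    total = sum(weights.values())
    return {g: v / total for g, v in weights.items()}

def should_stop(transitions, explored_values, all_values, window=10, threshold=0.05):
    """Stop once every feature value has been explored and the moving
    average of the latest `window` state transitions is below threshold."""
    if explored_values < all_values or len(transitions) < window:
        return False
    return sum(transitions[-window:]) / window < threshold

w = update_weights({"bow": 0.6, "handshake": 0.3, "hug": 0.1}, "handshake", 5)
# positive feedback raises the handshake weight relative to the others
```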
# Model of Greetings: Summary
* Thanks to this implementation, mappings can evolve quickly, without requiring hundreds or thousands of iterations,
* but rather a number comparable to the low number of interactions humans need to understand and adapt to social rules.

# TODO: Please add slides covering chapter 3 (implementation on ARMAR-IIIb)

# Implementation on ARMAR-IIIb
* ARMAR-III is designed for close cooperation with humans.
* ARMAR-III has a humanlike appearance
* and sensory capabilities similar to humans.
* ARMAR-IIIb is a slightly modified version with a different shape of the head, the trunk, and the hands.

# Implementation of gestures
* The implementation of the set of gestures on the robot is not strictly hardwired to the specific hardware.
* The patterns of the gestures are defined manually.
* Gestures are defined in the Master Motor Map (MMM) format and then converted for the robot.

# Master Motor Map
* The MMM is a reference 3D kinematic model
* providing a unified representation for various human motion capture systems, action recognition systems, imitation systems, and visualization modules.
* This representation can subsequently be converted to other representations, such as action recognizers, 3D visualization, or implementations on different robots.
* The MMM is intended to become a common standard in the robotics community.

# Master Motor Map
<img src="pictures/MMM.png" style='width: 60%; margin-left: 150px; margin-top: -50px;'>

# Master Motor Map
* The body model of the MMM can be seen in the left-hand illustration in the figure.
* It contains some joints, such as the clavicula, which are usually not implemented in humanoid robots.
* A conversion module is necessary to perform a transformation between this kinematic model and the ARMAR-IIIb kinematic model.

# Master Motor Map
<img src="pictures/MMMModel.png" style='width: 60%; margin-left: 150px; margin-top: -50px;'>

# Converter
* A naive converter of given joint angles would consist of a one-to-one mapping between an observed human subject and the robot.
* The conversion is addressed by applying a post-processing procedure in joint angle space:
* the joint angles, given in the MMM format, are optimized concerning the tool centre point position;
* an initial solution is estimated by using the joint configuration of the MMM model on the robot.

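The post-processing idea can be illustrated with a toy joint-space conversion; the joint names and the redistribution rule (expressing a torso bend through the neck, as the conversion-example slide describes) are assumptions for illustration, not the MMM framework's API.

```python
# Toy MMM-to-robot conversion: shared joints map one to one, and a
# post-processing rule handles joints the robot lacks.
MMM_POSE = {"neck_pitch": 0.1, "torso_pitch": 0.6, "r_shoulder": 0.3,
            "clavicula": 0.2}
ROBOT_JOINTS = {"neck_pitch", "r_shoulder"}  # no torso or clavicula joints

def convert(mmm_pose, robot_joints, redistribute=None):
    # One-to-one mapping for joints present on both models
    pose = {j: a for j, a in mmm_pose.items() if j in robot_joints}
    # Post-processing in joint angle space for the missing joints
    for j, a in mmm_pose.items():
        if j not in robot_joints and redistribute:
            redistribute(pose, j, a)
    return pose

def bow_via_neck(pose, joint, angle):
    # The robot cannot bend its torso, so a bow is expressed by the neck
    if joint == "torso_pitch":
        pose["neck_pitch"] = pose.get("neck_pitch", 0.0) + angle

robot_pose = convert(MMM_POSE, ROBOT_JOINTS, bow_via_neck)
# neck_pitch now absorbs the torso bend (0.1 + 0.6); clavicula is dropped
```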
# MMM support
* The MMM framework has broad support for every kind of human-like robot.
* MMM can define the transfer rules;
* using the conversion rules, motion can be converted from the MMM model to the movement of the robot.
* Conversion from the MMM model may not be possible for a specific robot,
* but the motion representation parts of the MMM can be used nevertheless.

# Conversion example of MMM
* After programming the postures directly on the MMM model, they were processed by the converter.
* The human model contains many joints which are not present in the robot configuration.
* ARMAR does not bend the body when performing a bow;
* the bow was expressed using a part present in the robot (e.g., the neck).

# GestureExample
<img src="pictures/GestureExample.png" style='width: 60%; margin-left: 150px; margin-top: -50px;'>

# ImplementGestureARMARⅢ
<img src="pictures/ImplementGestureARMARⅢ.png" style='width: 60%; margin-left: 150px; margin-top: -50px;'>

# Modular Controller Architecture, a modular software framework
* The postures could be triggered from the MCA (Modular Controller Architecture, a modular software framework) interface, where the greetings model was also implemented.
* The list of postures is on the left, together with the option.
* When that option is activated, it is possible to select the context parameters through the radio buttons on the right.

# Modular Controller Architecture, a modular software framework
<img src="pictures/MCA.png" style='width: 60%; margin-left: 150px; margin-top: -50px;'>

# Implementation of words
* Greeting words are used in two languages: Japanese and German.
* For example, in Japan it is common to use a specific greeting in the workplace, 「otsukaresama desu」,
* where a standard greeting like 「konnichi wa」 would be inappropriate.
* In German, such a greeting type does not exist,
* but the meaning of "thank you for your effort" at work can be directly translated into German.
* The robot knows dictionary terms, but does not understand the difference in usage of these words in different contexts.

# Table of greeting words
<img src="pictures/tableofgreetingwords.png" style='width: 60%; margin-left: 150px; margin-top: -50px;'>

# Implementation of words
* These words were recorded through free text-to-speech software into wave files that could be played by the robot.
* ARMAR does not have speakers embedded in its body;
* two small speakers were added behind the head and connected to another computer.

# Experiment description
* Experiments were conducted in a room in Germany, as shown in the figure.
<img src="pictures/room.png" style='width: 60%; margin-left: 150px; margin-top: 50px;'>

# Experiment description 2
* Participants were 18 German people of different ages, genders, and workplaces,
* so the robot could be trained with various combinations of context.
* It was not possible to include all combinations of feature values in the experiment:
* for example, there cannot be a profile with both ['location': 'workplace'] and ['social distance': 'unknown'];
* the ['location': 'private'] case was left out, because it is impossible to simulate the interaction in a private context, such as one's home.

# Experiment description 3
* Some participants repeated the experiment more than once:
* for example, the experiment was repeated at different times,
* or the social distance changed from 'unknown' to 'acquaintance' between interactions.
* In this way we could collect more data by manipulating the value of a single feature.

# Statistics of participants
* The demographics of the 18 participants were as follows:
1. gender: M: 10; F: 8
2. average age: 31.33
3. age standard deviation: 13.16

# Statistics of participants
* The number of interactions was determined by the stopping condition of the algorithm.
* The number of interactions, taking repetitions into account, was 30:
1. gender: M: 18; F: 12
2. average age: 29.43
3. age standard deviation: 12.46

# The experiment protocol is as follows (1 - 5)
1. ARMAR-IIIb is trained with Japanese data.
2. The context data of the encounter are given as inputs to the algorithm and the robot is prepared.
3. The participant is instructed to interact with the robot considering the current situation.
4. The participant enters the room.
5. The robot's greeting is triggered by an operator as the human participant approaches.

# The experiment protocol is as follows (6 - 10)
6. After the two parties have greeted each other, the robot is turned off.
7. The participant evaluates the robot's behaviour through a questionnaire.
8. The mapping is updated using the subject's feedback.
9. Steps 2-8 are repeated for each participant.
10. Training stops after the state changes have stabilized.

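The protocol above can be sketched as a training loop; the evaluation function, learning rate, and stopping window below are illustrative assumptions, not the authors' parameters.

```python
def run_experiment(mapping, contexts, evaluate, lr=0.2, window=3, threshold=0.01):
    deltas = []
    for context in contexts:                         # step 2: context as input
        greeting = max(mapping, key=mapping.get)     # steps 3-6: robot greets
        score = evaluate(greeting, context)          # step 7: questionnaire (1-5)
        old = mapping[greeting]
        mapping = dict(mapping)
        mapping[greeting] *= 1.0 + lr * (score - 3)  # step 8: feedback update
        total = sum(mapping.values())
        mapping = {g: v / total for g, v in mapping.items()}
        deltas.append(abs(mapping[greeting] - old))
        if len(deltas) >= window and sum(deltas[-window:]) / window < threshold:
            break                                    # step 10: stabilized
    return mapping

# German participants rate handshakes highly and bows poorly (illustrative)
evaluate = lambda g, ctx: 5 if g == "handshake" else 2
final = run_experiment({"bow": 0.5, "handshake": 0.4, "hug": 0.1}, [{}] * 30, evaluate)
```

Run this way, the mapping shifts from bow-dominant to handshake-dominant within the 30 interactions, mirroring the direction of change reported in the Results slides.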
# Results
* The gestures changed over the course of the experiment as follows:
* handshakes became common, while bowing was greatly reduced;
* hugs, which did not exist in the Japanese mapping, appeared.
* This is because participants gave feedback that a hug was appropriate.

# Results
<img src="pictures/GestureTable.png" style='width: 60%; margin-left: 150px; margin-top: -50px;'>

# Results
* The biggest change in the words mapping is that the workplace greeting disappeared.
* A smaller change is the use of informal greetings.

# Results
<img src="pictures/GreetingWordTable.png" style='width: 60%; margin-left: 150px; margin-top: -50px;'>

# Limitations and improvements
* The first obvious limitation is related to the manual input of context data.
* The integrated use of cameras would make it possible to determine features such as the gender, age, and race of the human.

# Limitations and improvements
* The robot itself could determine whether the greeting was correct.
* A speech recognition system and cameras could also detect the human's own greeting.
* The distance to the partner, the timing of the greeting, head orientation, or other information could be used to decide whether the response to a greeting is correct and what is expected.

# Limitations and improvements
* It is possible to extend the set of contexts by using additional sources of data.

# Different kinds of embodiment
* A humanoid robot has a body similar to a human's,
* but robots can differ in shape, size, and capability.
* The appropriate greeting type may therefore differ for each robot.
* By extending this system, each robot could, depending on its physical characteristics, discover the interaction methods best suited to itself.

<style>
.slide.cover H2 { font-size: 60px; }
</style>

<!-- vim: set filetype=markdown.slide: -->