I guess we'll start by having you introduce yourself.

My name is Reid Simmons. I'm a research professor at Carnegie Mellon University in robotics and computer science. And I came to the AI Lab at MIT in 1979 to do my graduate work.

As an undergraduate, I was very much interested in computer graphics, and that's basically where I thought I was going to end up. And I took a course from Stuart Shapiro, who is also an AAAI Fellow now, and was just kind of blown away by the fact that there actually was a field there. People weren't just playing at making machines be intelligent. There were actually people out there doing research, and you could actually have a career doing that type of thing. So I said, wow, that's a lot more interesting than doing computer graphics. Those were the days when computer graphics was like Pong, not what it is now. I think I picked the field that didn't accelerate as much as the other one, but it's been a lot more interesting, I think.

So when you ended up at the MIT AI Lab, what kinds of things were going on there that you remember? Any cool projects? People?

Yeah, there was a bit. For the first year, basically, I was working with Chuck Rich and Dick Waters on the Programmer's Apprentice. What they were trying to do was get a software system to generate software automatically. But after the first year, I switched over to working with Randy Davis. He had just started putting together a project to do model-based reasoning. The big rage up until that time had really been expert systems, rule-based systems, and there was a common understanding that these systems were very brittle, because they didn't really capture knowledge. They just captured rules of thumb.
And so what we tried to do was develop systems that had a deeper understanding of whatever problem they were trying to solve, and more general ways of solving those problems. Most of his group was working on analysis of circuits. Because of my background in graphics, I was interested more in spatial reasoning. And he had some contacts with Schlumberger, who were doing oil exploration and were very much interested in spatial reasoning. They were trying to understand what's happening underneath the ground by looking at the data that comes back, trying to figure out what was happening there and what processes must have taken place millions of years ago, because they were interested in figuring out where the oil was. So it was a nice combination of my interests in both model-based reasoning and spatial reasoning.

So I started working with them on that problem of doing geological interpretation: trying to understand, from a picture of a geologic region, what actually occurred. And this was related to work in planning. Interpretation, trying to figure out what sequence of events happened, is the inverse of the problem of planning, which is to generate a sequence of events to make something happen. So I got involved in the planning community at a time when a lot of things were going on there.

Drew McDermott was very much involved in doing work in planning. He was at Yale at the time but had done his work at MIT, and he was a big influence on the work I've done. Gerald Sussman's early work on planning and debugging was influential when I was working on that. And then some of the fellow students, Phil Agre and Dave Chapman, came along and turned the whole thing on its ear by saying, well, intelligence wasn't really planning anymore; it was just more reacting to the moment.
So it was a time when there were a lot of big arguments in the AI Lab about model-based reasoning approaches versus more reactive, embedded-type approaches. For a long time, the two camps were pretty separate. You were either in one camp or the other. Now, after 20 years have passed, there's been a lot more blending, and most people, I think, will agree that neither camp is right or wrong. There are aspects of both of them, and the trick has been to see how we can put it all together. But in those days, people had very strong opinions, and they really tried to stake out very rigid positions in order to make their points.

OK, that makes sense. So I guess, where did you head after that, the next stage in your career?

I had every expectation of continuing on in the area that I'd done my thesis work in. I was looking for a position, and, by chance, I got a call from Tom Mitchell, who was at Carnegie Mellon at the time. He said he was involved in a robotics project with Red Whittaker, who was in charge of building the robots, Takeo Kanade, who was in charge of the machine perception, the vision, and Tom, who was basically in charge of making the robot smart. And he needed someone to come and work as a postdoc to do that.

I'd never done any work in robotics. There was a lot of robotics work being done at the AI Lab. Rod Brooks was there, Tomas Lozano-Perez, and a lot of Brooks's students. Ian Horswill and Jon Connell were doing interesting work there. But I had never done any of that. In fact, the opposite: I was in the camp where it's a lot of reasoning, you think about things, and that's it. So now it's like you're reasoning and thinking about things, and you have to actually connect it to the real world. And that was a real eye-opener.
So that basically completely changed the nature of my research, by actually having to make robots work. Rather than just sticking with the very high-level reasoning aspects, we had to look at how to actually deal with uncertainty, how to deal with the fact that the world is really messy. And as I said, it brought those two competing factions closer and closer together. Because it's clear that to actually do things in the world, you have to react to the moment and to what's there. But on the other hand, to be intelligent, you have to take a step back and look at the higher-level reasoning.

OK, that makes a lot of sense. So I guess I'll ask what some of the most challenging things you're facing right now are.

One of the things I've been interested in for the last couple of years, and it's kind of consumed me, is human-robot social interaction. The interesting thing is that as I started getting into it, there's more and more evidence that being intelligent and being social are inextricably linked. A lot of what we consider intelligence, like language and communication, is there more for social reasons than anything else. We're intrinsically social creatures, and you really can't separate the social aspects from the intelligence aspects in many ways.

And it's fascinating, because people are very, very good at both generating and reading social cues: from gaze, from gesture, from tone of voice, even posture. If I want to be assertive, I can lean in like this, and that tells you a lot about what I'm thinking and how you should react to it. And when I back off, it's a lot less threatening, because I'm relaxed. Our robot systems can't do any of that, in general. There's some work going on in that area, but it's still fairly rudimentary, I think.
So the interesting question is, if we make these systems, these robots in particular, more social, are they going to be more intelligent, just by virtue of the fact that they're more social? That's a really challenging problem, and something that we're hoping is actually going to bear a lot of fruit.

That makes sense. OK, let's see. Do you read science fiction?

Yeah, I do read science fiction.

What are some of your favorite things you've read recently?

Oh, let's see. One of my favorite authors is Orson Scott Card, the Ender series. And more recently, I've become enamored with Philip K. Dick. I never really had much of an interest in reading science fiction that had to do with robots, in general. But I started reading some of his stuff, and it's just very engrossing.

A lot of people I've talked to say that either reading science fiction about robots or AI, or seeing movies like 2001, were the things that kind of drove them into the field. And I can honestly say that I don't think that had anything to do with the fact that I got involved in this. It really was just this notion, back in the mid '70s, that you could actually do this. In those days, we had a computer at our university that, as I remember, cost $5 million and had 100K of memory. It was probably less powerful than the computer in your cell phone, and this was what the whole university was using. And it was incredible to think at that time that something so relatively slow, compared to today, was going to be able to emulate the types of capabilities that humans exhibit. You really had to take it on faith, because a lot of the algorithms that people were developing would run in minutes rather than seconds to make certain decisions.
And you say, this can't be right. How could this be a good model for how humans work if it takes minutes to do what humans can do in seconds? And now, of course, with computers a thousand times faster than they were 25 or 30 years ago when I started, the things that used to take minutes now take milliseconds, and it's all feasible. So it's interesting how the technology has evolved to the point where a lot of the ideas people had years and years ago, which other people thought were completely infeasible, have become feasible because of technology.

So where do you think we're headed? What's AI going to do? Or robotics? They've been in academic institutions for a while now. Is Homer Simpson ever going to run into a situation where he says, I'm talking to a computer, or computers are directly affecting my life? As opposed to, for example, expert systems, which have been in industry and planning and such for a long time, but they're not really directly interacting--

With people.

Yeah.

I think so. This is one of the things that drove me into social robots. I'll give an anecdote from my time at Carnegie Mellon. We had a robot named Xavier. We were interested in robust navigation, and in order to demonstrate robust navigation, you want to navigate for many hours, over long periods of time. So rather than just have it wander the halls, we had it be able to go from room to room. Over the internet, and this was the early '90s, you could tell it to go to different places. And so the robot would basically wander the halls for four or five hours a day, and we kept this going for quite a number of months. It validated the algorithms we had worked on, but what was also interesting was the social aspects. You could see how people reacted to the robot.
And these were people at a technological university. They were in a place that did robotics, but, by and large, they'd never seen a robot, and they didn't know how to react to it. You could see that there were kind of three types of people. There were the wall huggers: the robot would come down the hall, and they'd just stand against the wall and wait for it to pass. Those were the most prevalent, actually. Then there were people who were kind of nonchalant, who would walk by it normally. That was a small number. And then, actually, the smallest number were the antagonists. They would jump in front of the robot and harass it, saying, if you think you deserve to be in this corridor with me, you need to get out of my way or be able to deal with the fact that I'm in your face. And those were graduate students.

But what was interesting is that even with the people who were nonchalant about it, you had the situation where you're trying to go through a doorway, someone else is coming through the doorway, and you do the little dance: you go back and forth a few times before you finally figure out how to pass. Well, this happened all the time, and it was because the robot didn't know the social rule. The social rule is: you're coming down the corridor, you go to the right, the other person goes to the right, you pass. The robot, because it didn't have any inkling of that social rule and was just trying to find the best free path to go through, caused a lot of consternation. It was fine for the robot. The robot didn't care. It didn't have feelings. It didn't have to get anyplace at any particular time. But a lot of times the way people would react to it was, rather than trying to outthink the robot and what it was going to do, they would just stand to one side, let the robot pass, and then go on.
And that's how they dealt with it. It was interesting, because they were basically adapting their behavior to the robot, which is completely the wrong thing. If we want to put robots in society, then the robots need to adapt to humans, rather than having humans adapt to robots. And I think we need to get to the point where robots can, from a conversational point of view, use cues like gaze and gesture the way we do, and also, from a spatial point of view, keep personal space and know social rules: things like, if you want the robot to ride an elevator, it needs to wait until people get off, just like a normal person would. These days, robots that ride elevators have no social graces. Basically, the elevator door opens and they go right on, and it causes consternation in people. So I think until we can solve that problem of having robots behave more socially, there's really no hope for Homer Simpson to be interacting with a robot and actually liking it.

Sounds good. Unfortunately, we're going to have to wrap up. But thanks for talking.

OK, great. Sure. My pleasure.