My name's Rodney Brooks. I'm director of the Computer Science and Artificial Intelligence Lab at MIT, and it's 2006.

So I'll start off with the start-up question. What got you into the field of AI? What got you interested?

When I was about eight years old in Australia, I got this How and Why Wonder Book on giant electronic brains. I read that, and it introduced the binary system and had pictures of consoles with flashing lights, and I knew I wanted to build them. And then 2001: A Space Odyssey came out, and Marvin Minsky had worked on that and was mentioned right at the start of the book as an inventor of AI. And that made me want to come to MIT. And so I spent my childhood in a shed in the back garden building stuff, building computers, and trying to build robots, and this disturbed my mother a lot. She would come down and say, why can't you be like a normal kid and come and watch TV?

Wow. So childhood interest. You came to MIT and--

I proudly have never been admitted to MIT. I applied to grad school at MIT, and I did not get admitted. I got into Stanford and CMU, so I went to the library, got my atlas, and found where Pittsburgh and Palo Alto were, and Palo Alto seemed nearer to the ocean, so I went to Stanford and did my PhD there. Then I applied to MIT for a faculty position, and I got turned down. But eventually I got here.

Very nice. When you first got to MIT, you worked at the AI Lab?

I first came to MIT in 1981 as a postdoc, and I was working for Tomas Lozano-Perez with robots, with robot arms, trying to get them to do assembly. And we were approaching it very mathematically, and as we tried to put things together with the robots and do planning, classical planning, I realized I had to deal with the uncertainty, because there were tolerances for all the parts, and there was uncertainty in exactly where the robot was. So I started trying to model that, and I developed this whole theory of uncertainty analysis. And I wrote the world's most boring paper, which was 30 or 40 pages analyzing uncertainty, and it just got worse and worse. And that was trying to put a peg in a hole, so I sort of realized that that wasn't going to work, and I had to change things.
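[A minimal sketch of the kind of tolerance stack-up Brooks is describing, with made-up numbers that are not from his paper: once part tolerances and positioning uncertainty are added up, the worst-case lateral error can dwarf the clearance between peg and hole, so a purely position-based plan cannot guarantee the insertion.]

```python
# Hypothetical worst-case uncertainty stack-up for a peg-in-hole insertion.
# All numbers are illustrative only, not taken from Brooks's analysis.

peg_diameter   = 9.90   # mm, nominal
hole_diameter  = 10.00  # mm, nominal
peg_tolerance  = 0.02   # mm, +/- on the peg
hole_tolerance = 0.02   # mm, +/- on the hole
robot_position_uncertainty = 0.50  # mm, +/- on where the arm really is
part_location_uncertainty  = 0.30  # mm, +/- on where the hole really is

# Smallest radial clearance that could occur between peg and hole.
worst_case_clearance = ((hole_diameter - hole_tolerance) -
                        (peg_diameter + peg_tolerance)) / 2

# Largest lateral misalignment a position-only plan has to survive.
worst_case_offset = robot_position_uncertainty + part_location_uncertainty

print(f"worst-case radial clearance: {worst_case_clearance:.3f} mm")  # 0.030 mm
print(f"worst-case lateral error:    {worst_case_offset:.3f} mm")     # 0.800 mm
print("pure position control suffices:", worst_case_offset <= worst_case_clearance)
```

[Every additional part, fixture, and joint adds more terms like these, which is why the analysis kept getting longer.]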
Do you mind if I'm not looking at you?

No, that's fine.

OK. I assume you're going to edit this a little bit here and there.

Probably, yeah.

Yeah.

It's fine. Whatever you like.

So then what?

It got worse and worse. I was out at Stanford on the faculty then, and I went with my first wife to Thailand, where she was from, and spent a month in this house on stilts in the river while she talked Thai with all her relatives. And I just sat there, not being able to communicate with anyone, watching ants and mosquitoes, lots of mosquitoes. They were buzzing around and moving around quite well in the world, and the ants were picking things up and moving them. And I was thinking about how many neurons could they have and how fast could they compute? And I figured they couldn't be as good as the mainframe computers we were using then, so they must be arranging their computation differently. They couldn't be doing all this symbolic reasoning that my planning program was trying to do. And that got me thinking about how I could make the system more like an animal system. And so out of that-- eventually, a few months after that, I came back to MIT on the faculty-- out of that came what was called the subsumption architecture.

Very nice. When you first came to MIT, what other kinds of research were going on? What else was happening?

When I was here in 1981, it was the year of the robot. A lot of people came from outside under some funding from a foundation, and we really thought robotics was going to take off in the classical sense of robotics.
And we had the various robot arms, and a lot of computer vision, and we were putting this on LISP machines. There were about, I don't know, 16 or 18 LISP machines. And so the game was to get in early in the morning and sit down at one before other people got here, so you could hack all day. But if you left the machine, someone would come and take it from you. So you had to sit there the whole day, hacking. They had 256K of memory, and they were pretty big. And we were trying to do vision and planning for robots. We had this idea that we would go straight from designing things to manufacturing things with robots directly. So someone would sit there with a CAD system, design the parts and how they go together, and out of that would come generated programs that ran the robots. And that was a real force for things coming together in the AI Lab at that time. At the same time, there was the Connection Machine work going on with Danny Hillis, one of Marvin's students. And Marvin was still around at that time, but he was not as active as he had been, I would guess.

Got you. So subsumption architecture, basically. Is it on the questions?

Oh, sorry. Did I miss something?

No, no. Where did you go after you had this-- the need to sort of make the machines not process like a math machine?

Our first robot, Allen, was able to navigate around in real time with very little computation, actually, and avoid obstacles, even mobile obstacles, and that was just maybe four or five years after the cart at Stanford. The cart at Stanford had gone 20 meters in 6 hours in a static environment, and when the environment wasn't static, it got very confused. But my robot could go 20 meters in a minute, so that was quite a big step up. But it didn't build models of the world; it just reacted to what was there. This got a lot of people really angry.
You know, this isn't intelligence, this is just reactivity, it's horrible. It will never scale. And so then, rather than trying to make that robot better and better, I fell into this game of building more and more robots. So I started just building robot after robot, and every new graduate student had their own robot. And we started doing lots of things. And eventually, in 1988, Colin Angle, who was then a UROP, and Grinnell More, the son of Trenchard More, who had been at the original AI conference at Dartmouth-- Grinnell was a high school dropout who worked in my lab-- he and Colin and I built the robot Genghis, a six-legged walking robot. And whereas other robots had been very slow and had worried about stability, I'd again looked at insects and videos of insects, and seen how insects didn't maintain stability. They fell down all the time. So removing the constraint that the robot had to stand up all the time just made it incredibly easier to build a robot which could walk over rough terrain. And so then we built a whole series of walking robots, and again, people would say, oh, well, that's not intelligence. That's just Rod playing with his little robots. So Colin and I and another student, Helen Greiner, decided, well, we'll just start a company and show them. We did that in 1990. And a mere 15 years later, we were successful with real robots.

Good deal.

Yeah. Or sorry. The idea was 15 years, an overnight success.

Having talked to a bunch of different people in the field, there seems to be a feeling of, we need to get common sense reasoning in order to have human level intelligence, and this is the great goal of AI.

That would be Marvin?

That would be Professor Winston who had said that to me.

Marvin, yeah.

Yeah.
I think when we look back-- when other people who are alive in the future look back at what we were doing now-- just as with all sciences, they'll see a lot of things we're doing and think, well, why were they worried about that? That's not the real issue. So as an example, when I started working with walking robots, a lot of the previous work had been on the gait: which legs go when? And when we stopped worrying about that and just made the robot walk, the gaits emerged as a property of the interactions of the legs and the environment. So I think a lot of things that people get upset about in AI as being big, deep problems will likely fall out as side issues. And I suspect non-monotonic reasoning is one of them, and I suspect common sense is one of them. That will come out of a bunch of other abilities of robots. And that's my intellectual bet; I may be totally wrong. But I think it's sort of putting the cart before the horse to worry about either of those problems at this moment. When you can't even get a robot to see that's a pen and pick it up, well, how can I have any common sense about something that I can't even perceive or touch?

That makes sense.

So you're a sucker, like I sucked you in.

No, I mean it makes a lot of sense. Sorry if I keep looking at the camera, but--

No, it's fine, wherever you want to look.

Lost my train of thought.

Oh, OK.

So human level intelligence, is this still a good goal? Or is this still more or less the goal?

In some ways, I think we all suffer from what I call "Star Trek syndrome." If you watch Star Trek, all the aliens look humanoid. Why do they look humanoid? Because that's the actors you can get to play them, Star Trek being made before computer graphics was cheap enough to generate them.
And so Hollywood has this terrible lack of imagination and only dreams up humanoid robots, and I think we-- and I've certainly been guilty of this for the last few years-- dream up humanoid robots, but that's just one little piece of intelligence, one thing that evolution stumbled across. There could be lots of other sorts of intelligences. We can't agree-- we don't know whether dolphins are intelligent or not. So I can imagine us building all sorts of robots that do all sorts of things, and then we'll still be debating, well, they're not really intelligent, are they? Meanwhile, they might build some cities and rockets to the moon and stuff. And we'll say, no, they're not intelligent, they're just reacting. So human level intelligence, yes, in principle, but whether they'll be like actual humans-- that might be a sideshow. I think we'll have, over the next 50 years, more and more robots in our lives. I can't imagine a future 50 years from now-- without some disaster having taken place-- where our everyday lives are not full of robots. So whether they're humanoid or not-- I tend to think they're not going to be humanoid-- there are going to be lots of robots in our lives, just like there are computers everywhere in our lives now.

OK. When you think of robots, do you think of them as tools, I guess? [INAUDIBLE] An eventuality that every home has robots, just like today computers are everywhere-- just tools to help humans, or coworkers?

Yeah. Cynthia Breazeal, who was a PhD student of mine and is now on the faculty of the Media Lab, had this phrase "friend or appliance," and I think that sort of sums up the ambivalence towards what these robots will be. At one level, we just want appliances. My fridge works 7 days a week, 24 hours a day; I don't want to have to feel sorry for it working so hard. So, as I build robots-- I've got a Roomba and a Scooba in my home, and they clean the floor.
And every day my Scooba cleans up the dog poop, and I don't feel bad about the robot cleaning up the dog poop, and I wouldn't like to have to feel bad. So maybe I don't want robots that are too intelligent, because then I'll start identifying with them, and I'll start feeling bad for them. Certainly the experience that we've had with our humanoid robots in the lab, and then my own experience with selling millions of robots to the general public, is that a lot of people identify with them and project onto them. And some people say, well, that's bad, you're projecting too much. You're anthropomorphizing the robots too much. But I think we anthropomorphize people, and that's how we make people people, by anthropomorphizing them. I don't think it's anything particularly different. That gets some people really upset. But we're used to dealing with entities as though they have free will, and that's what gives us free will, because we attribute it.

That's interesting. So I guess we'll step back a little bit. We've heard a lot of sort of silly anecdotes and fun stories from people we've talked to about this time at the AI Lab, or at different places, I guess, largely as a function of the unique atmosphere that was there. Do you have any such recollections or fun stories? We heard Professor Sussman and then Professor Minsky were talking about the same thing, what is apparently sort of an internet legend, about Professor Sussman wiring up a machine to play tic-tac-toe with a randomly wired net and--

I've heard them both tell the story.

OK. Yeah, I don't know, just little things like that.

I think for me, in Tech Square, the most fun was just that we were always building robots. There was the machine shop up on the ninth floor, and at all times of day or night-- you'd go in there 3:00 AM, 4:00 AM-- some people would be working on robots, other people would be fixing their cars.
They'd bring half a car in there to fix it. But it was just the atmosphere, that nothing was forbidden, anything went, and you could do anything. And that was a lot of fun. There'd been a history, by the time I got there, of building time-sharing systems and operating systems and things like that. But by the time I was there, it was all about building robots and putting embedded processors together, and lots of wires and connectors and batteries, and making the robots do stuff.

Actually, I think that probably the single most stunning moment for me was with the first robot. I had come from this environment of trying to do classical planning, and so we'd always done things in simulation, where we simulated the robots with graphics. So when I started building my first mobile robot at MIT in the AI Lab, in late 1984 into 1985, when I was a very junior assistant professor, I, of course, built a 2D simulation on a LISP machine-- by that time we had a new generation of LISP machines, and I got one of the old ones and had it in my office all the time-- and I'd gotten the whole control system to work on the simulation; the little circle robot would move around in the graphics. And then I had the physical robot there, and I was working late at night, night after night, trying to get everything debugged, get all the commands down to the robot. And time and time again, I'd get everything running, switch it on, download and run, and nothing would happen. And then I'd go poking at the memory and find out, ah, this thing's wrong, that thing's wrong. It had been going on for days and days like that. And then I switched it on, just like I had maybe 30 or 40 times over the last few days, and this time, it moved. The thing actually moved. Instead of the simulation, the robot was actually working. And that was just the greatest feeling. It was alive. And I think that's what, when I was eight years old in Australia with the flashing lights, that's what I'd been after: making it alive. And Allen was alive. So that, to me, was the most fun. I was all by myself at the time, but it was just the best experience ever.

What did you have it do? What was the first thing? Was it just wandering around?

Allen had a layered architecture, and its lowest level layer was to avoid stuff. So when it finally moved-- it was right next to the bench-- it just ran away and went out into the middle of the room and sat there, because that was away from everything.

Nice.
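[To make the layered-architecture idea concrete, here is a minimal, hypothetical sketch in Python of that style of reactive control. It is not Brooks's code: his actual subsumption architecture wired together augmented finite state machines with suppression and inhibition links rather than a priority loop. The point is only that simple layers reacting to sensor readings, with no world model, are enough to produce behavior like Allen's run-to-the-open-space response.]

```python
import random

# Hypothetical sonar input: a non-empty list of (bearing_deg, range_m) readings.

def avoid(sonar):
    """Lowest-level competence: move away from anything that gets too close."""
    bearing, dist = min(sonar, key=lambda r: r[1])          # closest reading
    if dist < 0.5:                                          # within half a meter
        away = bearing - 180 if bearing > 0 else bearing + 180  # face the other way
        return {"turn_deg": away, "speed": 0.2}
    return None                                             # nothing to avoid

def wander(_sonar):
    """Next layer up: occasionally pick a new random heading."""
    if random.random() < 0.1:
        return {"turn_deg": random.uniform(-90, 90), "speed": 0.3}
    return None

# Simplified arbitration: the first behavior that produces an action wins.
# Here avoid preempts wander; in the real subsumption architecture the layers
# ran concurrently and interacted through suppression/inhibition of signals.
BEHAVIORS = [avoid, wander]

def control_step(sonar):
    for behavior in BEHAVIORS:
        action = behavior(sonar)
        if action is not None:
            return action
    return {"turn_deg": 0, "speed": 0.3}                    # default: keep going
```

[A control loop would simply call control_step on every new set of sonar readings and send the result to the motors: no map, no plan, just reaction, which is exactly what upset people at the time.]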
You talked a little about Star Trek syndrome. I was just curious whether you read science fiction.

I devoured science fiction when I was a boy: Arthur C. Clarke and Isaac Asimov, both of whom knew Marvin well-- Marvin Minsky, that is-- and Robert Heinlein, and those authors. So I read a lot of that, and then some British authors too, Captain W. E. Johns, I think his name was. And I really devoured that, but after I was in my 20s, I didn't really read science fiction anymore, because I was too busy building stuff and having fun.

That makes sense. You don't read it now?

No.

Do you read fiction now?

No.

No. Fair enough. No, I've heard this response a lot of times. I guess that'd probably about do it. Do you have anything else you wanted to talk about?

OK, I'll tell one little thing.

Sure.

People often ask me about the future, and they ask me about the future in two different ways. One way they ask me is, well, you know, we saw such and such a movie, is that what's going to happen? And I keep having to remind people that movies are not predictors of the future, because if they were, we'd all be fighting aliens on the streets and there'd be ghosts everywhere.
And Hollywood movies about the future of robots are normally pretty dumb, because what Hollywood does in those cases is take a world just as it is today and change one thing: put an emotional robot or a humanoid robot amongst everything as it is today. But the reality is everything changes all the time. Our world is so much different from what it was 10 or 15 years ago. We now Google everything; we've got everything at our fingertips. We didn't have that 10 or 15 years ago. The world has changed, the amount of information we have has changed. We can't fool people. You can fool some people, but you can't fool a lot of people as easily as you could. So the world is not going to be like Hollywood.

But people ask me, are we going to have robots? And so the game I like to play-- I always choose 50 years. They say, should we work on x, will we have x? I think about 50 years from now, and ask, can I imagine x not being true? So for instance, will we have teleportation 50 years from now? I can certainly imagine that we won't have teleportation in 50 years. So to me, that says it's not worth trying to work on teleportation. Because who knows? Someone will have to invent some weird new physics, and who knows whether that will happen? Will we have robots helping the elderly in their homes 50 years from now? I can't imagine not. I can't imagine that a vacuum cleaner in 50 years won't be a robot. Right now, 1% of vacuum cleaners are robots. In 50 years, every vacuum cleaner will be a robot, just like every vacuum cleaner is now electrified, whereas they weren't 50 years ago. So once I say, well, it's going to be true in 50 years, now it's just a matter of deciding, well, is it really 30 years, or 20 years, or 5 years? And what do I have to work on to make that true?
And that tells me what the interesting questions are to go after, because it's within people's lifetimes that this can happen, and we just have to make it happen.

I lied. I have one more question, because you reminded me of it. Say I'm a researcher who is just entering the field of AI. Or, I don't know, just say I just got accepted at MIT and I want to go do AI. What do you say? Have you got any advice? Do you have any "don't do this"? Any "I don't care"?

I think there are four big issues that, if we make progress on them, will help robot-kind to exist. Back 40 years ago, Gerry Sussman was assigned the task-- when he was a sophomore, I think-- of solving the object recognition problem. A two-year-old walking into this room, if I gave them this, they'd say that's a pen. They'd say this is a sheet of paper. They'd say that this is a table. They can recognize all these objects never having seen them before. So the object recognition capabilities of a two-year-old: an unsolved problem. Gerry Sussman failed in 1966, my PhD in 1981 failed on that-- lots of failures, but that's a big issue. Second issue, the language capabilities of a four-year-old child. A four-year-old child can talk with counterfactuals, and has linguistic capabilities such that, if a robot had them, we could get the robot to do almost anything and have the robot explain almost anything to us. Third, the manual dexterity of a six-year-old. A six-year-old child can do this sort of force insertion. A six-year-old child can tie shoelaces; they can do everything that manual workers really do, but maybe not with as much strength.
And lastly-- so we've got the two-year-old's object recognition, the four-year-old's language capability, the six-year-old's manual dexterity-- the understanding that a 10-year-old has of people in social settings, their intents, their goals, their subterfuges. If we get our robots to understand that, they will understand what we really want and why we're trying to do what we're doing. So to me, those are the four big problems. Now, will you need to do non-monotonic reasoning or probabilistic reasoning or common sense to do those? That I'm somewhat agnostic about, but I think those capabilities are worth going after.

Awesome. [INAUDIBLE]

Yeah.