I'll just ask you to introduce yourself.

Hi, I'm Danny Bobrow. I work at the Palo Alto Research Center, formerly called Xerox PARC, but now it's just called PARC. I first started working in artificial intelligence in 1957, when Marvin had first come to MIT and he and John had just started the MIT AI project. And I took Marvin's first course in artificial intelligence, which spent a lot of time on Turing machines, because that was the model.

What was he like then?

What was that?

If I can jump in, what was he like then?

Marvin was, if you can imagine, a 30-year-old young professor, full of energy, who could do anything. He would program. He's an incredible piano player, I don't know if you know that, and a composer, and an inventor. He invented the scanning electron microscope, one of the first versions of that.

No way. Really?

Yeah. And I think one of his first inventions was something, I forget exactly what, to detect metal in the eye, because his father was a physician.

And when I joined him, I found that I wasn't joining a kind of faculty club; it was kind of a family. Marvin always opened his house to graduate students, and people were over there all the time. I was there with him when he had his first kids, and then when he had the twins. There were always interesting people around. He had a living room in his house over there which had room for two grand pianos, plus electronic instruments. And he would play these things, among other things, and other people would come and play them too.

When he moved into this house, what he had in his living room was more like a playpen. Since Cynthia was only a year old or so, he had all the boxes of books, since he hadn't found a place to put them yet.
And he arranged them like a giant playpen in the middle of the floor, and she played in the middle, and everybody else was around the outside. The room was about three times the size of this room. It was an old house, and he had put a bolt in one of the ceiling beams, and then a rope, so you could go swinging back and forth across the playpen area. And then he had all sorts of toys and things that you could make things with.

So if we step back to before you joined the lab, what got you interested in AI beforehand?

It was just a dream of what AI was. And I loved the idea. All I had read was the proposal for the project. I was offered a job as a research assistant on the project, and Marvin Minsky, John McCarthy, and Claude Shannon were the three principals. But then Marvin had all sorts of other friends, like Norbert Wiener, whom he knew, or McCulloch and Pitts, the people who did the neural models. McCulloch was a philosopher and wrote this wonderful paper called "What Is a Number, that a Man May Know It, and a Man, that He May Know a Number?"

So we had all these different kinds of people around. And it was really an intellectual community, where what you did was read things and then have discussions. We talked about things.

I came up with an idea for my thesis, and I talked to him about it. He said, well, that's a good idea. I don't know if you heard my talk this morning; I mentioned it there anyway. It was doing algebra story problems. Marvin thought that was a good idea, and he let me go on with it. And if I ever came to him, he would answer questions. But he always believed that he was just providing the atmosphere for students to explore for themselves.

So I recently read Eugene Charniak's thesis on [INAUDIBLE]. I don't know whether you're familiar with it.

I'm quite familiar with that.
His thesis on [INAUDIBLE] followed on from my STUDENT thesis.

That's what I was going to ask.

So he was in the next generation.

You might be interested in where the AI lab started. It started in the basement of Building 26.

Really?

Yes. We had a big room, in which we had our own punch card machine, so we could actually type in our own things. And it was really quite amazing then. I think Tech Square opened in '61 or '62.

1967.

Something like that. And to have all that space, to be able to be outside, was really quite amazing.

What kind of stuff did you-- I guess, what sort of computer did you have in your--

Well, originally it was a 704, which was a tube machine. And then it got to be a 709. And then they finally got a 7094, which was the transistor version of that thing. As I mentioned before, it had 32K 36-bit words. That was the size of the memory, and that was as good as it was going to get. That was the big machine around.

And we had Lisp. And Lisp was not an interactive language. With Lisp, you just put a file in, and a day or two later you got back your printout, whatever you had sent to print out from your run. That's how you did things.

I've read a paper outlining Lisp as a mathematical construct before it was even brought onto computers. Do you know much about that transition?

Well, I was there. I watched McCarthy do this. There was a whole seminar McCarthy was running, and a whole issue about what to do. It was really based on-- you should talk to John. John is here.

Yeah, we're hoping to talk to him.

My impression was he was very impressed by [INAUDIBLE] mathematics.
And [INAUDIBLE] had basically some mathematics which allowed the representation of various kinds of expressions, and John translated that. In his original paper on symbolic computation, he talked about how those things fit together.

So how long were you at MIT [INAUDIBLE]?

Let's see. The first year I was actually at Harvard, and got my master's degree from there. But since I was taking some courses at MIT, I was coming down. I went to Harvard because I was offered a position at both Harvard and MIT, and I chose Harvard because they said, oh, we'll give you a fellowship. At MIT, they were going to give me a research assistantship. Well, a fellowship must be better. But it was boring. Harvard applied mathematics did not have anything very exciting going on. There were a few interesting people coming out, [INAUDIBLE] and a few others. But at MIT you had Marvin, and John, and all those other people that I mentioned.

And so I came down. Things were loose enough at that time: I had gotten the offer from the mathematics department, and I just went down and talked to the secretary there and said, look, I think I made a mistake; I'd like to come here next year. She said, yeah, we still have your file, I think that would work. And they made it work, just like that.

So I was there from, well, '57 or '58. In 1964 I got my degree. And then I was a postdoctoral fellow and assistant professor for another year. So I was at MIT for eight years. Those were the years from when we were just four or five or six graduate students working with Marvin and John, up to having the whole AI laboratory over in Tech Square.

So you got to see the beginnings of a lot of projects that were going on here.

That's right.

Do you remember any in particular-- if you had anything to say about anything in particular?
Oh, let's see, what kind of--

If you found anything specifically cool, or--

For a new scholar of artificial intelligence, or for people not super familiar with the field, what would you say were some of the most exciting things?

Well, let's see. I still think one of the most interesting pieces of early work was Tom Evans's thesis on analogy. I don't know whether that's still done, but that was a really marvelous piece of work where he took standard College Board analogy problems. He had two programs, one of which transformed the pictures into descriptions, and the other of which did the analogy work, and it actually did very well on the analogy tests. So he did that.

The other thing was the whole development of what Lisp was and what it meant to be able to have it, and the idea that inside of it you could build other languages. A great insight for me was looking at the interpreter for Lisp written in Lisp, and realizing that if you could build an interpreter for Lisp, you could build interpreters for other kinds of languages. And in fact, that's what I did for my thesis: I built one language which made it easy to transform the natural language statements of algebra story problems into equations, and I built another language which was good at solving equations. So I built these languages. The heart of my thesis was only six or seven pages of text, which just had those transformations for the language part, but there were 30 pages for the interpreter before that. So that was what that was about.
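[A toy sketch of the kind of thing Bobrow is describing, in Python rather than the original Lisp. The sentence patterns, names, and the tiny substitution solver below are invented here purely for illustration and are not the actual STUDENT program, which handled far richer English and real linear-equation solving. The shape, though, is the same: one small language rewrites English statements into equations, and a separate one solves them.]

```python
import re

# Two toy sentence patterns, standing in for STUDENT-style rewrite rules.
ASSIGN = re.compile(r"^(\w+) is (\d+)$")                              # "y is 5"
AFFINE = re.compile(r"^(\w+) is (\d+) more than (\d+) times (\w+)$")  # "x is 3 more than 2 times y"

def translate(sentence):
    """Rewrite one English sentence into a small equation record."""
    s = sentence.strip().rstrip(".").lower()
    if m := ASSIGN.match(s):
        return (m[1], ("const", int(m[2])))
    if m := AFFINE.match(s):
        # "x is a more than b times y"  ->  x = b*y + a
        return (m[1], ("affine", int(m[3]), m[4], int(m[2])))
    raise ValueError(f"pattern not recognized: {sentence!r}")

def solve(equations):
    """A deliberately tiny 'equation language': solve by repeated substitution."""
    values, pending = {}, dict(equations)
    while pending:
        progress = False
        for var, rhs in list(pending.items()):
            if rhs[0] == "const":
                values[var] = rhs[1]
            elif rhs[0] == "affine" and rhs[2] in values:
                _, b, y, a = rhs
                values[var] = b * values[y] + a
            else:
                continue
            del pending[var]
            progress = True
        if not progress:
            raise ValueError("cannot solve by simple substitution")
    return values

if __name__ == "__main__":
    story = ["y is 5", "x is 3 more than 2 times y"]
    print(solve([translate(s) for s in story]))   # {'y': 5, 'x': 13}
```

[As in the thesis Bobrow describes, most of the bulk is the interpreter-like machinery; the domain knowledge itself is just a handful of rewrite patterns.]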
And then there was the whole business of building the time-sharing system, the work that was going on at MIT at that time. This was another revolution, because, as I was saying, when I was doing my thesis I had a deck of cards, this big. And I had to bring that in and carefully edit the deck, and put the cards in the right place so it would load the program correctly. Then I'd submit the deck, and they would give me back the deck, and they would give me back the printout a day or a day and a half later.

But when we were building the time-sharing system, we could, when we weren't actually testing it, get to use the machine ourselves. I had the entire five-million-dollar machine that I could work with. I could get rapid turnaround, meaning putting in a run and getting it back ten minutes later.

What kind of machine was it?

This was a 7094. And then, when they put time sharing in, I could sit there and actually get results out incrementally. It just changed the way you thought about the work. Imagine if today you had to do your programming and go away for an hour to get a compilation done, or to do anything. It just changed things. This was another insight of John McCarthy's, who drove that whole effort.

And, let's see, what else? Oh, I guess there was Richard Greenblatt's work on chess, which I always thought was really fun.

I met him [INAUDIBLE]. Were you involved at all in the [INAUDIBLE] group?

I was not. I visited there; it was nice to see. But I never got into it. I had a lot of other kinds of interests. I actually enjoyed going out on dates, and things like that.

[LAUGHTER]

Yeah, well, anyway, especially toward the end, I got married and had a child.

One thing is, I had this dream that, having done language processing for these first algebra story problems, I could really carry it on. I said, well, now I would be able to do better. And I had my first child, my oldest daughter, just a week after I turned in my thesis.

Using parallel processing here.

Fortunately, the heavy work was being done by my wife. Well, it was quite good of her to wait that week, because she was typing up my thesis, and getting further and further from the typewriter.
But she had typing skills that, at that point, I had not developed.

But then I had this dream. I said, oh look, I have a brand new child. Now I ought to be able to keep up with the language learning my child is doing; I'll be able to program and do things that way. And after about three years, I realized I was just not keeping up. So I had another child, another daughter. And still, the problem was harder than I thought. But this much I should learn, one-shot learning: it's not going to work to just keep having children to keep up with.

So do you have any favorite anecdotes or fun stories from the AI lab, either [INAUDIBLE]?

Let's see. I don't know whether anybody ever told you that they actually hacked into the elevator control system, so that at one point, if you wanted the elevator to come up, you could get it to come up to the ninth floor from your terminal. There were people who decided they didn't like that. It was some military group, I forget on which floor. And being called up to the ninth floor when they thought they were going down, they decided, was not fun. Anyway, so they were hacking into the [INAUDIBLE].

I guess, having been in the field of AI basically from the beginning and having seen the whole thing go from nothing to what it is today, what are some of the problems you see that have been solved, and what are some of the challenges that have been with us from the beginning?

Well, let's see. I think the continuing challenge has always been how to transform and represent the knowledge that we have, not just declarative knowledge but procedural knowledge, into a form that makes it possible to manipulate it in the machine, and to use it that way. As I mentioned, I spent a lot of time on building new languages.
Because, as it turns out, I've always believed that the right way to solve a problem, and really to solve a class of problems, is to build a language in which the solution to the problem is expressed very easily. So if that takes building a language in which you can express that, [INAUDIBLE] is there. So I think we've seen more and more of these things: more languages, more ways of expressing things, more ways of doing the manipulation. And that, I think, is one of the continuing challenges that we see today: how do you do that in a way that makes it coherent, and makes it livable?

The other challenge that I see as ongoing is what I call my two-baby problem. It's really hard to scale. It's really hard to make things get past the first three examples. And I think one of the reasons why we see a lot of machine learning today is that people say, oh, well, now I will be able to deal with lots of data. And they do it on the average.

But what have we done? We've done expert systems, whose success was that they were a set of languages designed to make it easy to express knowledge, and they're now in use all over. There are a lot of vision programs. There are ways that natural language is being used in very limited domains. But I still think the general problem is wide open. Lots of problems to be solved, lots of interesting ones.

Makes sense. So I guess, not necessarily at the inception of the field, but during your time at MIT, in the really early years of the field, where did you see it going? Where did you think you were going to be in 20 years? Where did you think we'd find ourselves in [INAUDIBLE], in terms of AI?

One of the problems I thought we faced was a version of my natural language story,
which is that every time we were able to solve some simple version of a problem, we thought the next step would be easy. So language we would have in 20 years, and pattern recognition. We thought it would be easy to do all these things, and none of these things turned out to be simple. The world turns out to be much more complicated. There's far more interaction among things. As somebody was saying in this recent panel, the brain is a 400,000-year-old legacy application.

[LAUGHTER]

It's one continuous add-on from a basic fish brain. So it's not simple in structure. And somebody else was referring to, I guess, a recent paper by Jerry Sussman about how we build robust systems. To build robust systems, you don't want minimal engineering models; you want things which have lots of redundancy and alternative ways of doing things. And I think we haven't yet figured out an architecture which allows us to do that sort of thing. We keep trying to make the problems simpler.

One of the problems AI faces today is math envy. Namely, mathematics gets reduced to a few forms, a few equations, and we can extrapolate from those, and everything follows from them. But I don't think that's going to be true for AI, or for intelligence.

We have two minutes.

We have about a minute.

Oh, shoot. OK, I'd better be quick.

One more question. Besides your own work, which obviously is the coolest thing out there or you wouldn't be working on it, what else do you find really, really neat going on in the field right now? What makes you go wow?

Well, let's see. What makes me go wow. Well, there are two things that I think were discussed in this last panel, one of which is some of the robotics work.
I think actually being able to get in there and affect and interact with the world is really fun and exciting. And it really challenges you, because, as I said, the world is not simple. The combinatorics mean you can't solve everything in advance; you have to be able to use feedback.

And the other intriguing one is the work that Tom Mitchell was just talking about, which is: how do we use brain instruments to teach us things about intelligence? How do we use our--