[INTERPOSING VOICES]

What do I say?

Well, I think I'll ask you to introduce yourself first.

I'm Gerry Sussman, Gerald Jay Sussman. I've been at MIT since 1964, when I was a freshman.

Very nice.

With two years off, for good behavior.

So how did you get into the field, back in '64 or shortly after?

Well, actually, it came a little earlier. As I explained in this morning's session, which probably-- well, you didn't have the camera on before. At Erasmus Hall High School in Brooklyn, New York-- a huge high school-- I got involved with a Columbia program for high school students. And one of the teachers in that program was Joel Moses, who was an undergraduate at Columbia at the time. And he taught a class in computer programming for high school students, which I took, and that's how I got hooked on computers.

The most interesting thing, of course, was that I was also a ham radio operator. And here was this machine with lots of electron tubes in it, and that was very interesting to me, because I'd never seen something with 5,000 electron tubes. So that was sort of fun, to understand those things.

So I then got hooked on computation. I went off to look at various things, and I got pretty good at programming. And at some point, in the Columbia library-- actually, the library school's library-- there was a book called the IPL-V manual, and I read this thing and said, "Oh, man. I could write an interpreter for this. It could be--" Well, I didn't know the word interpreter. "I could write a program that could really understand this language." And I did. So that was fun, and I was probably a junior in high school at that time.

And then somewhere I read a book called Automata Studies, with Shannon and people like that, with Minsky's papers in it, and McCarthy's. And that said MIT was the right place, so I showed up at MIT as a freshman in 1964.
I didn't really know I was going to have anything to do with artificial intelligence; I was just interested in playing with computers. But I was a math major. That was pretty clear to me-- math and physics were what I liked a lot.

And somewhere at the beginning of my freshman year, I hung out with people who lived in Baker House at MIT, who were mostly physicists at that time. Alan Guth was an undergraduate, for example.

Wow.

There was a fellow named David Cressey who played with computers a lot; he was a physics undergraduate. He said, "There's this interesting computer over at Tech Square on the ninth floor that nobody seems to be using very much. Let's go play with it."

So I went over there with him, and there was a fellow who seemed to be in charge of it by the name of Greenblatt. He seemed to be an upperclassman, and he didn't mind me playing with the computer. So I learned how to program this, and it was pretty painful. Although I'd programmed other computers before, this one was sort of strangely documented. It was a very nice machine, though. And so I made a little neural net that played tic-tac-toe.

And one day this bald-headed fellow showed up, and it was obvious he was going to throw me out, because he was the boss. It turned out it was Minsky. He came over and asked me, "Gee, what are you up to?" And I explained to him what I was doing. I said, "How do I get paid for that sort of work?" So that was fun.

And then he explained to me the most important thing I'd ever heard in my life-- but I was a freshman, so I got hooked. He said, "Why did you make your network random?" I said, "I didn't want it to have preconceived misconceptions." He said, "It has them. It's just that you don't know what they are."

So that's how I got hooked on AI, because this was the smartest thing I'd ever heard anybody say in my life. So I did all sorts of things.
At some point, Minsky put me, as an undergraduate, in charge of the summer vision project that Minsky and Papert ran. That didn't work very well, but it was supposed to solve vision in one summer.

[INTERPOSING VOICES]

They [INAUDIBLE] didn't understand.

Actually, that's the right thing to do. The right thing to do is always to take a hard problem that you don't know much about and find an undergraduate to work on it. I do that too, and the reason is that undergraduates don't know enough to know that it's hard, and therefore you're going to see them try something, whereas somebody who knows that it's hard is not going to make any progress at all.

Fair enough.

So we found out what things were hard. A lot of things were hard. That's probably what Marvin wanted-- he wanted to find out what was hard. So I think that was very, very smart.

I've been working on lots of things ever since, in both artificial intelligence and related fields. So, for example, my doctoral thesis-- well, actually, my bachelor's thesis was a little theorem prover. [INAUDIBLE] I learned about resolution theorem proving, and I thought it was a great thing. And around 1968, when I was a senior, I wrote a little theorem prover that was sort of nice, although of course it didn't do anything more than other people's theorem provers did.

I suppose I grew out of logic after a while. I wrote a PhD thesis about programs that debug programs, which was sort of fun. And as Marvin pointed out, nobody has really followed up on that very much, and that's sad, because it was a good idea. I think it still is. I didn't follow up on it because I got attracted to lots of other things, you see?

So I got involved in understanding how people-- I got into teaching. As soon as I got a doctor's degree, I became an assistant professor. It probably was a bad idea, but I did it anyway.
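[Editor's note: for readers who haven't met the resolution theorem proving mentioned above: resolution derives a new clause from two clauses containing a complementary pair of literals, and a refutation proof is a derivation of the empty clause. A minimal propositional sketch in Python-- the clause representation and names are illustrative, not Sussman's program:]

```python
# Clauses are frozensets of literals; a literal is "p" or its negation "~p".
def negate(lit):
    return lit[1:] if lit.startswith("~") else "~" + lit

def resolve(c1, c2):
    """Return every resolvent of two clauses."""
    out = []
    for lit in c1:
        if negate(lit) in c2:
            out.append((c1 - {lit}) | (c2 - {negate(lit)}))
    return out

# Refuting {p or q, not p, not q}: reaching the empty clause proves
# the set unsatisfiable.
c_pq, c_np, c_nq = frozenset({"p", "q"}), frozenset({"~p"}), frozenset({"~q"})
step1 = resolve(c_pq, c_np)      # [frozenset({'q'})]
step2 = resolve(step1[0], c_nq)  # [frozenset()] -- the empty clause
print(step2)
```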
And so the thing is, I got into teaching electrical engineering, which was fun, since I liked electronics and was a ham radio operator and all that.

My main concerns started to be about how to think about teaching better-- how to understand how to explain to people the things that go on in the minds of good engineers and scientists. And so I started modeling those things. Most of the work I've done in artificial intelligence has been understanding the techniques that are used by scientists and engineers in doing their work-- partly for automating something, and partly because a formal, precise description-- for example, a computer program-- is something that can be used as an explanation. So if I give you the program and you know how to read the programming language, then you understand how to do the job that the program explains. So that was sort of a teaching idea. Most of the work I've done is basically in that direction, and that's what I'm very proud of.

Could you tell me about LISP and SCHEME?

Well, that's not very important, actually, but I'll tell you about it.

Go ahead.

Well, people always ask about that, because somehow it seems to be very associated with me and my friend Guy Steele, who was an undergraduate student. But we never thought it was important. Let's see.

The reason I ask--

LISP is a wonderful language. First of all, when I was a freshman, I was taught LISP-- taught by very, very smart people. Like Gosper, for example, who put a lot of effort into that. And he had wonderful, wonderful ways to think about programs. He was an upperclassman, also. And I learned a huge amount of programming from Gosper.
I'm a mathematician-type person, so the thing that's attractive about something like LISP is that it's a language in which, among other things, it's possible to express statements about the statements of the language, in the language. You can quote, and you can talk about the quoted sentences, and you can take them apart and put them together with [INAUDIBLE] and so on. The language itself is manipulable, and that's why you have things like the eval-apply interpreter. That's a beautiful, beautiful idea.

It had some bad properties, the LISP that we had. It was dynamically scoped, for example, which I thought was not that great. I don't remember why I minded that. I suppose it's because-- again, as a mathematician-logician type of person-- I care about referential transparency. But I didn't think much about that.

Every so often I had to invent a language to push my AI ideas. So I had to invent something. The first one was an implementation of a very small subset of Carl Hewitt's PLANNER language, to help Terry Winograd finish his PhD thesis. I'm explaining that because it was a programming thing, and I like programming and I understand how to write interpreters. I loved that idea.

MicroPLANNER.

So that was MicroPLANNER. And then the next thing was, for my own stuff, I wanted to make a little more elaborate thing, which turned out to be a flop, called CONNIVER. And SCHEME would have been called SCHEMER, except that didn't fit in six letters. The size of a filename on the PDP-6 or PDP-10 ITS system was six letters, and so it became SCHEME.

But the way this occurred was-- it's basically a LISP. There's no question; it's nothing special or new, it's a LISP.
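[Editor's note: to give a flavor of the eval-apply interpreter Sussman praises above-- programs represented as data, run by a program-- here is a deliberately tiny sketch in Python. Everything here (the list encoding, the environment representation, the names) is an illustration assumed for this note, not the MIT code:]

```python
# A tiny eval/apply pair for a Scheme-like calculus. Expressions are
# Python lists and atoms; environments are dicts chained by a parent link.
def lookup(env, name):
    while env is not None:
        if name in env["vars"]:
            return env["vars"][name]
        env = env["parent"]
    raise NameError(name)

def seval(expr, env):
    if isinstance(expr, str):              # a variable reference
        return lookup(env, expr)
    if not isinstance(expr, list):         # numbers are self-evaluating
        return expr
    if expr[0] == "quote":                 # ["quote", x] -> x, unevaluated
        return expr[1]
    if expr[0] == "lambda":                # ["lambda", params, body]
        return ("closure", expr[1], expr[2], env)
    op = seval(expr[0], env)               # otherwise: an application
    args = [seval(a, env) for a in expr[1:]]
    return sapply(op, args)

def sapply(op, args):
    if callable(op):                       # a built-in, i.e. a Python function
        return op(*args)
    _, params, body, env = op              # a closure: extend its own env --
    frame = {"vars": dict(zip(params, args)), "parent": env}
    return seval(body, frame)              # -- which gives lexical scoping

global_env = {"vars": {"+": lambda a, b: a + b}, "parent": None}
# ((lambda (x) (+ x 1)) 41)  =>  42
print(seval([["lambda", ["x"], ["+", "x", 1]], 41], global_env))
```

[The closure capturing its defining environment is the lexical scoping SCHEME adopted in place of the dynamic scoping Sussman complains about here.]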
Guy was a Harvard undergraduate who then became a graduate student at MIT. We were sitting next to each other, I think, in a talk being given by Carl Hewitt, about control as being patterns of passing messages. It was a very hard talk, and we were having trouble understanding it. So I said to Guy, or he said to me, "Let's try to figure this out." So we holed up in my office. I called my wife and said, "We're very busy," and she would complain that I'd left her in the lurch. But we stayed up all night and wrote a few interpreters.

And the actor invocation mechanism that we thought we were inventing, to fit what Carl talked about, turned out to be exactly the same as lambda application in a less mixed-up language. So we threw away the extra copy of that, and that was SCHEME. SCHEME was an attempt to understand Carl Hewitt. That's how it--

Makes sense.

--arose.

Interesting.

We didn't think much of it, other than the fact that, because we had produced it, it was easily manipulable by us. And so Guy Steele wrote a master's thesis about compiling, which was very important, which invented the continuation-passing style and things like that. But the reason he used it was that he understood the thing-- the language-- very well. We had invented this very nice interpreter that we could manipulate. And we never actually published papers about it; we wrote monthly AI memos. But we never thought it was sufficiently important. We were mostly AI people, and it got [INAUDIBLE]-- constraints, not about SCHEME.

In fact, the reason SCHEME became more interesting to the world was, for the most part, due to Dan Friedman of Indiana University, who picked it up and said, "This is a good thing." It wasn't really our doing at all.

How's that?

That's good, thanks.
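[Editor's note: for readers who haven't seen the continuation-passing style credited to Steele's thesis above: every function takes an extra argument, the "rest of the computation," so control flow becomes explicit data and every call becomes a tail call a compiler can treat as a jump. A minimal sketch, with assumed example code rather than anything from the thesis:]

```python
# Direct style:  fact(n) returns its result to an implicit caller.
# CPS:           fact_k(n, k) hands its result to an explicit continuation k.
def fact_k(n, k):
    if n == 0:
        return k(1)
    # "Compute (n-1)!, then multiply by n, then continue with k."
    return fact_k(n - 1, lambda r: k(n * r))

print(fact_k(5, lambda x: x))  # 120 -- the identity continuation ends the chain
```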
So you were at the AI lab pretty soon after it was created, [INAUDIBLE]--

Well, originally it wasn't called the AI lab; it was the AI Group of Project MAC. Project MAC was a rather impressive but small community of people who did all sorts of things with computers.

Do you have any favorite anecdotes or cool stories from back then? I've heard some talking to Rick Greenblatt; he had some pretty--

Yeah, well, probably one of--

Fun tales.

The most interesting to me: there used to be a thing called CTSS there, a time-sharing system. And there were these people called the hackers, of whom I was sort of a marginal member. Tom Knight was a high school student when I was a freshman, so he was, I think, one year younger than me. So Stewart Nelson and Tom Knight, I think, at one point decided to invade and take over CTSS. And they had some plan, but they didn't know how to get the-- for some reason, what you had to do was get past a particular operator. The operator who runs the big machine-- there was an operator guy-- you have to be able to deposit some numbers in some memory locations. So I became the-- what's the right word? In the spy kind of novels and things, it's the guy who actually does it.

So I had to be friends with the operator. And then at one point, at the appropriate time, while Mr. Nelson was going to deposit 177 in the restriction code for T2012881, which was Seymour Papert's login, I deposited the 177 in the right place by taking over the data channel-- the 7909-- when the operator wasn't looking.

Well, actually, they didn't know it was me, but Tom Knight got banished from the building for a while. But it [LAUGHTER]-- it all worked out OK.

[INTERPOSING VOICES]

Right. So this was sort of one of the early computer unpleasantnesses.
Right, fair enough.

That was fun, actually. Let's see.

I guess when you were there at Tech Square--

Well, I still am. At MIT. Tech Square's gone, that's the problem.

In the early years of Tech Square, in terms of communicating with other labs, was there really any kind of collaboration going on?

Other labs?

Other AI labs.

Oh, AI labs? Well--

The impression I have gotten is that there were sort of different enclaves of--

Well, it is true that from time to time John McCarthy would drop in, because he was friends with Marvin. And I met him when I was an undergraduate. But remember, I was an undergraduate, so it's hard for me to estimate that.

But I did have a lot of connection with a fellow by the name of Jerry Lettvin, a rather famous neurophysiologist and an electrical engineer who invented some of the early amplifiers for recording from nerve cells. He was friends with Marvin, and he had some people he was with called McCulloch and Pitts, I think. I never met Pitts, but I used to get McCulloch's mail and bring it to him. And I got to be friends with Jerry Lettvin, who also taught classes in the humanities department, besides being a professor of biology and electrical engineering. He taught a class with Giorgio de Santillana on some history-- specifically The Crime of Galileo, which was a book written by Santillana explaining how-- why Galileo got himself into trouble. So there was a lot of connection at that level inside MIT. But as an undergraduate, it was very hard for me to see the connections between laboratories.

Yeah, that's a great question.

As far as AI goes, involved in the more theoretical [INAUDIBLE] constraints and [INAUDIBLE]--

I don't know. Theoretically, I did a lot of things that work.

Yeah. Sorry.
No, you don't have to apologize. I'm saying I'm a programmer.

Right. Do you have any experience with robotics, or the use thereof?

Well, sure. At the very beginning, when I first arrived, people were building certain kinds of robot arms. In particular, there was an arm made by AMF-- American Machine and Foundry-- which was a large hydraulic machine, quite dangerous, and I nearly got killed by it.

Really?

Oh, yes.

How'd that happen?

In fact, I think after I almost got killed by it, they put in a safety switch-- so now you had to hold some sort of deadman control to use it. This was a very, very big machine, used in automobile factories for putting the sheet metal into the press or something. It would go, "Womp."

I suppose when we got it, it had a tape-driven controller, where it would do the same thing over and over again and you could train it to do that. Connecting it to the PDP-6 computer required building a transistorized controller that would interface with it in a funny way. There were certain valves that had only eight positions of openness or something, and I remember there were several valves that had to be manipulated. And there were various other things, like the mechanisms for measuring where it was-- the position controls. When it came, those were optical encoders, and they were replaced for the moment by 10-turn potentiometers, which were glued into place with a minute-cure epoxy.

Oh, man.

Now, unfortunately, this was a very dangerous thing, and I didn't realize that. And it was very hard to control, because it had such crude control. So one day I was measuring its motion-- Guy was writing programs to try to get it to stop and move to places, and it was sort of wobbling back and forth because it was unstable.
And one day I was measuring it with a ruler, seeing how much it was moving, and one of the pots fell out-- the minute-cure epoxy broke or something-- and the thing went, "Womp," back and forth like this, because it hit its limit stops and then reversed. Because my program said: if it hits a limit, time to reverse. You see? And it did this bouncing back and forth, sort of walking along, and I started yelling and screaming, and Richard was like, "Hey, man, hit the stop switch on the computer," which helped me a lot, because I thought I was going to be killed by this thing.

Wow.

The thing was swinging around wildly.

Oh, boy.

I could have been the first casualty of a robotics experiment, but it turned out I wasn't.

Do you have any thoughts on robotics today and how it relates to AI?

I didn't get involved in robotics after that. Not because I was afraid of them, but because it seemed to me there were other things where you had more-- I decided that there was more intellectual bang per unit effort in other things.

That makes sense.

In fact, that's the way Marvin Minsky feels now, apparently. And I think I agree with him. If you want to have the maximum intellectual impact per unit of work you put in, then building physical robots is not the place where it happens.

I can see that, definitely. So I guess you're more of a CS person, but how do you see AI impacting society? We've heard that, for example, expert systems have branched out to a lot of industry and are used all over--

Sure, that's one application.

--the place.

And there are lots of other things like that, too.

Sure.

I think that it's very unlikely that much of the technology you see right now would be working if it weren't for AI.
There's a real question of whether we're talking about the core AI question of making intelligent machines, which clearly has not happened very much. There is not very much common sense in the mind of a machine-- probably worse than I would have expected at the time. But there is a huge number of spin-off results that have become standard practice in industry.

So, for example, you now may use a language called Java. It has a garbage collector, and the garbage collector was invented by McCarthy and Minsky. The first paper on the garbage collector, I think, was the 1962 paper by Minsky-- AI memo number eight, I think. I'm not sure. It was for the PDP-1, but it took a long time for that. It took until approximately 1990 for people to even realize that this was a good idea for a commercial thing. And when you use it, it gets rid of all memory allocation errors.

That's an example. There's lots of technology like that. The way operating systems work-- part of that came from AI-type things and part of it came from other work. So a lot of the infrastructural things that go on in your computer, I think, are AI lab offshoots.
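[Editor's note: as a hedged illustration of the garbage collection idea being credited here-- a toy mark-and-sweep collector, not the McCarthy/Minsky algorithm itself; the object model is invented for this sketch:]

```python
# Toy mark-and-sweep: a heap of cells, each holding pointers to other cells.
class Cell:
    def __init__(self, refs=()):
        self.refs = list(refs)   # outgoing pointers
        self.marked = False

def collect(heap, roots):
    # Mark: everything reachable from the roots survives.
    stack = list(roots)
    while stack:
        cell = stack.pop()
        if not cell.marked:
            cell.marked = True
            stack.extend(cell.refs)
    # Sweep: unmarked cells are garbage; unmark survivors for the next cycle.
    live = [c for c in heap if c.marked]
    for c in live:
        c.marked = False
    return live

a, b, c = Cell(), Cell(), Cell()
a.refs = [b]                          # a -> b; c is unreachable
print(len(collect([a, b, c], [a])))   # 2 -- c was swept
```

[Because unreachable memory is reclaimed automatically, a collected language rules out the manual allocation errors-- leaks, use-after-free-- Sussman alludes to.]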
So I guess, back then, when you were involved in AI lab activity, where did you see it going?

I don't know how to think about that. Again, you're talking about being an undergraduate. It was just fun. These were smart people. I'd go to dinner with very smart people and learn a lot of stuff. That was one of the things-- some of them were mathematician types, and some of them were people who were interested in just computing, and things like that.

It was very clear to me that there was a great mystery of how things think, or how to make things that could think-- how it could be that I work the way I do. And that's a great mystery, about the same level of mystery as how the universe is put together. Physics is about that mystery. And I'm interested in both of them: the very large mystery about the universe, and the very small mystery about how I work.

And it seemed to me that the method of investigating the universe as a whole is more of this reductionist thing that physicists do, and I like that: you try to understand how the little pieces work, and how little pieces are put together to make bigger pieces. So you get fundamental particles and atoms and things like that, and the forces that you have to deal with, and fields, and that sort of thing. And that's a wonderful thing to understand.

But there's another way to think about the world when you get to very complicated things, like biological systems or information-processing systems. There, thinking synthetically will help you more than thinking analytically. You say, "If I had to make something that had this behavior, how would I make it?" Well, there might be only a few ways to do that. It would be nice-- it may be true, it may be false-- if the way it actually works in, say, a person is one of those ways. If I'm looking at it, and it is, then that would be a great thing. But in any case, I can think about how one might do it.

Makes sense. I guess one last question-- I'm going to throw you a curveball. Do you read science fiction?

I have. Do I read it a lot? No. There are lots of people in the AI world who spend a lot of time on science fiction, and who like it a lot. As for me, I don't much like fiction at all, so I don't read much fiction. And the reason why is that the world is interesting enough without having to make up made-up worlds. The real world is very interesting.

Makes sense. What's your favorite thing you've read?

Recently?

Sure.
I'm working on a book by Carlo Rovelli called Quantum Gravity. It is very hard. It's taken me about a day per page.

Wow. Got you. All right, thanks very much for talking.

Sure.