Thanks for visiting today.

OK. Great. So yes, good afternoon. My name is Chuck Rich. I got interested in AI when I was an undergraduate at the University of Toronto in the early 1970s. I came to Tech Square as an incoming graduate student in 1973, to the AI Lab, and I left in 1991. I didn't quite make escape velocity, though; I'm still sort of in orbit around Tech Square. I'm currently working at Mitsubishi Electric Research Labs in Cambridge, in Kendall Square, about a block and a half from Tech Square. I've been there 15 years, and I was one of the founding members of that laboratory.

Let me see. The kind of work I did in the field: for the first seven and three-quarter years I was doing my master's and PhD, both of which were concerned with the Programmer's Apprentice project, started initially by myself and Howie Shrobe and later joined by Dick Waters. That project was really the overlap between AI and software engineering. It used software engineering as a domain for doing basic research in AI, so [INAUDIBLE], reasoning, and it also showed that AI could be applied to making better software engineering tools. So it was an early project in the intersection between AI and software engineering.

A couple of little anecdotes along the way about the time, about Tech Square, and about my experience there. One of the first interesting claims to fame that Howie Shrobe and I had: our supervisor was Gerry Sussman. I started talking to Gerry about a master's thesis topic, about AI and software engineering -- I was interested in both -- and I was going to his office, as graduate students do from time to time, trying to find an advisor and so on.
And it turns out that Howie Shrobe, who's also here -- he's also a fellow -- had independently been talking to Gerry about fairly similar things. So Gerry introduced us to each other and said, listen, guys, you're both talking about the same sort of thing -- why don't you do a master's thesis together? Gerry's a very collaborative person. And we said, fine; he's the professor, right?

Well, it turns out it was a great idea. Howie and I got along great, we've worked together, and we've been friends for a very long time, and it was very good from the standpoint of the research. But it was a disaster from an administrative point of view. It turns out that one of the big purposes of the master's program -- at least at the time I was there, and maybe it's still true; you had to do a master's thesis before a PhD -- was to see whether your master's thesis was good enough to get you into the PhD program. You really weren't admitted into the PhD program until you finished your master's thesis. And the department was very concerned that with a joint master's thesis they wouldn't know whether there was a weak sister -- no feminist offense intended, that was the term -- they wouldn't know who did what, and they wouldn't be able to decide who should go on to the PhD program.

Gerry was very, very good about it. He apparently stood on the desk -- I wasn't there, but I'm told he actually stood on the desk in the department chair's office -- and said he would resign if they didn't let us both into the PhD program, that it was an excellent thesis, and so on. And in fact we did some research: ours was the only joint master's thesis that had been done in several decades in the EECS department. This was 1975, '76 when we handed it in. And I bet you a nickel that all the other students saw what we went through and said, oh, god, you guys were almost murdered by that experience.
I'm sure no one ever attempted a joint master's thesis again. So that's an interesting recollection there, or claim to fame.

A couple of other things to say about people who inspired me. Gerry Sussman, who was my thesis advisor for both the master's and the PhD, was certainly a tremendous inspiration. The one thing I find really admirable about Gerry, GJS, and his research approach -- which I confess I haven't been able to duplicate -- is that he was, and probably still is, a master at embedding a really big idea in a really small program. When I was there -- things like the truth maintenance system -- you'd be there one afternoon talking about something, he and Jon Doyle and some of the other students, and he'd stay up all night with one of the other students, and the next morning he'd have this program that did it. And it would become a paper or a thesis. He just had a real knack for it. You see it in the Scheme book and in his whole approach: he has a way of distilling ideas down into a really tight computational framework. I, on the other hand -- well, I hope I have big ideas, but I'm lucky if I get them into medium-sized programs. More often I have big ideas and they turn into huge programs, and that's a lot more work and a lot less successful. So Gerry was just a terrific example in that regard.

The other comment I wanted to make in this interview, about those years, those 18 years on the eighth floor of Tech Square: there was a different social sense, particularly in the early time, the late '70s and early '80s. I'm going to sound like an old fogey, but there was something really special then, and maybe it was just part of being a small lab. First of all, at that point each year there were only a handful of new graduate students, two or three. It was a big deal; you never knew who was going to show up.
I remember going into Marvin's secretary's office and peeking at her desk to see the resumes of that year's incoming graduate students, that kind of thing. Every person seemed to really matter more. You also had the feeling that each new PhD thesis was like a whole new paradigm, which is good and bad. In many ways, in science, you need to build on other people's work, but there was a certain naivete at that point that the next big thing was the next PhD thesis.

One thing I remember really strongly is the playroom, the open area on the eighth floor, which was kind of the heart of the AI Lab. There were beanbag chairs, and director's chairs, and stuff like that, and a table, a big plant table. One thing I have a really strong memory of, particularly when I was a first-year graduate student: I would go in there at lunchtime -- we would eat lunch in there, sandwiches or whatever -- and on the table would be a copy of Scott Fahlman's master's thesis that he had just finished working on. He hadn't quite handed it in yet; it was the second, or third, or last draft. He put it there, and we would read it. It was an important thing. You'd pick up Scott's thesis and read it, and talk about it, and talk to him. There was a real sense of important things happening. I remember reading Howard Goldstein's PhD thesis when it first came out -- there it was, sitting on the table, and we had the advance copy, and it was something we looked forward to. We really felt part of a process and an excitement.

And it's important to understand that later on, about the time I left in '91, it was much more business as usual. It was a lab, and there were offices, and people working with their advisors and so on, but there wasn't that sense of focus, that feeling that every thesis was the next big thing. And I sort of miss that.
And maybe it's just the way the field itself has changed, for better and for worse.

Why do you think that shift took place?

I think there are two things. One is that the beginnings of things are special -- you see that in a lot of ways -- and that was closer to the beginning, and things were smaller, and that's just true. In the lab I'm in now, I was one of the founders, and in the first three, four, five years there were five people, then 10 people, then 15 people, and it was really special. Now we're about 75 people. It's not a bad place or anything; it's just different. Anybody who studies organizations will tell you that.

Also, there's this whole issue of trying to be more scientific, and that is good. The idea that each PhD should be a whole new paradigm is fun, but it's got its bad side too, because you could never build on anybody else's code or anybody else's work. And that was a problem.

Another big effect, just at that time, was that around 1980 David Marr arrived at the AI Lab. I was there. If you're a researcher, his name looms large in the history of the field. He really had a big effect on the lab, I think positive and negative. Up to that point, vision research -- with the exception of the work Berthold Horn was doing, which was always very mathematical and very rigorous -- was really quick and dirty and fairly ad hoc, like a lot of AI research was then: good old-fashioned AI style. And David showed how you could do it differently, how you could really be scientific. And there was a lot of talk that science was like a big hammer you were going to bang people over the head with.
And there was a lot of envy, I think, of how much success and progress -- clear progress -- David's students were making, because he was really making them be very rigorous and very mathematical and so on. They pushed the vision problems; they were making progress by leaps and bounds. And there was a certain sense of superiority -- they were more scientific than the rest of us scruffy people -- and an envy and a desire to imitate, that kind of thing.

Marvin complained bitterly. I think he regretted it. He said to me, at least privately, that he had brought David Marr in because he thought very highly of him -- I don't know what the context was exactly -- and I think he regretted it. Because basically everyone tried to become a David Marr, and that wasn't maybe the best thing for the field. You need a little bit of diversity; we shouldn't all try to do things one way. That way might have worked at that point in time, for that kind of problem, in that area.

I think that was the period when there began to be a shift toward trying to make AI more of a normal science. And I think David's presence and David's example -- a positive example in most ways, in many ways -- had a lot to do with that shift, as well as just the inevitable development and maturing of the field.

So, shifting gears a little: what's the most challenging thing you've worked on personally since you've been here?

I guess the most challenging area I've worked on is theorem proving. As part of the Programmer's Apprentice project, we were working on how to represent knowledge -- basically what programmers know about programs, the kinds of architectures and so on. And then we wanted to build a system, a tool, that would help programmers analyze their programs and synthesize them and so on. And that involved automated deduction. And I ended up writing a kind of automated-deduction, theorem-proving system.
And I must say it basically scared me off of that for the rest of my career. Because what I discovered was -- and other people probably do this better than I do -- but in my attempts, you'd write a bunch of rules, or axioms, or procedures that interacted, and you'd be focused on the path you were hoping the system would take on the examples you were working on. And the system -- the computer, whatever you want to call it -- was just so good at finding another path, where it could just count. It would just count: oh, I got one; how about the successor of this one? How about the successor of the successor? The successor of the successor of the successor? Somewhere in your framework there were some axioms that could count, essentially, and then your system would spend all its time counting and not do any useful work at all. So there was this feeling that these theorem-proving systems could just run away from you really easily.
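To make that runaway behavior concrete, here is a toy sketch in Python -- purely an illustration written for this transcript, not anything from the actual Programmer's Apprentice prover. With a Peano-style successor rule in the rule set, a naive forward-chaining loop just keeps "counting" and never gets to the goal you actually care about:

```python
# Toy illustration of the runaway described above (not the actual system).
# With the fact nat(zero) and the rule nat(X) => nat(succ(X)), a naive
# forward-chaining prover just "counts" forever instead of doing useful work.

def succ(term):
    return ("succ", term)

def naive_forward_chain(steps=5):
    facts = [("nat", "zero")]               # starting fact: nat(zero)
    for i in range(steps):
        _, x = facts[-1]                    # the successor rule always applies to the newest fact
        facts.append(("nat", succ(x)))      # derive nat(succ(x))
        print(f"step {i + 1}: derived {facts[-1]}")
    print("... and so on; the goal you actually care about is never reached.")

naive_forward_chain()
```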
And so since then I've ended up adopting various techniques for incomplete but efficient, bounded inference. And let me put in one more plug for someone who influenced me on that, since he's here: David McAllester, who had the office next to mine when we were graduate students. I am still using code that David wrote in 1978, '79, originally in Lisp. I translated it to another version of Lisp, and then I translated it into Java. And I'm still using his code for the equality system and the Boolean TMS -- actually, it's a combination of a Boolean TMS, a truth maintenance system, and an equality system -- which is still down deep in the guts of the work I'm doing now, which is more on interfaces, task-based collaborative interfaces. But down deep in the system is that same algorithm and almost the same code, translated line by line -- I've made a few improvements -- that David McAllester wrote in the late '70s.

So I'm a big fan of David's.

What does it do?

It's a Boolean truth maintenance system with equality. What it does is unit propositional resolution, which is a sound but incomplete version of propositional resolution, combined with a complete decision procedure for equality reasoning. So it's basically a little inference engine that's a trade-off between efficiency and completeness: it's incomplete, but it's very efficient. And it turns out that for a lot of the things I've used it for since, it seems to be just the right thing. The code has just been following me through revisions, and through translation after translation. I tell David that, and he cringes. I say, David, it's great -- the code still works; it was good code from the beginning. So that's kind of cool.
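A minimal sketch of the two ingredients he describes, written as an illustration for this transcript (not McAllester's Lisp code or its Java translation): unit propagation over propositional clauses, which is sound but incomplete, and a union-find structure for equality, which is a complete decision procedure for conjunctions of ground equalities.

```python
# Illustrative sketch only -- not McAllester's code or its Java translation.
# Two ingredients: unit propagation over clauses (sound but incomplete) and
# union-find for equalities (complete for conjunctions of ground equalities).

def unit_propagate(clauses):
    """clauses: list of sets of literals; a literal is (name, polarity).
    Returns derived assignments, or None if a contradiction is found."""
    clauses = [set(c) for c in clauses]
    facts = {}
    changed = True
    while changed:
        changed = False
        for clause in clauses:
            if any(facts.get(v) == sign for v, sign in clause):
                continue                         # clause already satisfied
            open_lits = [(v, sign) for v, sign in clause if v not in facts]
            if not open_lits:
                return None                      # every literal false: contradiction
            if len(open_lits) == 1:              # unit clause: its literal must hold
                v, sign = open_lits[0]
                facts[v] = sign
                changed = True
    return facts

class UnionFind:
    """Equality reasoning: a = b follows iff find(a) == find(b)."""
    def __init__(self):
        self.parent = {}
    def find(self, x):
        self.parent.setdefault(x, x)
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]   # path halving
            x = self.parent[x]
        return x
    def union(self, a, b):
        self.parent[self.find(a)] = self.find(b)

# (p or q), (not p), (not q or r)  ==>  unit propagation derives p=False, q=True, r=True
print(unit_propagate([{("p", True), ("q", True)},
                      {("p", False)},
                      {("q", False), ("r", True)}]))

uf = UnionFind()
uf.union("a", "b")
uf.union("b", "c")
print(uf.find("a") == uf.find("c"))   # True: a = c follows from a = b and b = c
```

A real truth maintenance system also records justifications so that conclusions can be retracted and explained when premises change; none of that bookkeeping is shown in this sketch.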
So I don't know if you work with it much, but do you have any strong feelings about Prolog?

Yeah, I do, actually. I think the right approach is to have a Prolog interpreter written in some other language, so that for those things for which you want that backtracking logical algorithm, you can use it, but you can also do conventional programming in the same framework. Otherwise you end up doing this ridiculous thing where you're using little tags to make things sequential. It's always odd, the things you're forced into. So I think it's not a good general-purpose programming language, and you need to know when to use it. For example, there are various Lisp-Prolog hybrid systems -- is it LispLog or ProLisp? I forget what it's called.

ProLisp sounds familiar.

Yeah, I'm not sure. I think it's actually the other way around. But to me, just like you'd want a standard implementation of A* search available in a library, you'd want to have a Prolog engine there in your library, whether it's in Java or in Lisp, and you'd use it when it's appropriate. But I don't think you want your whole system written in Prolog. Often you end up doing things that are much more easily done some other way, but because you want to keep it in Prolog, you end up making the code obscure. So I'm not a big fan in that regard.
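As a sketch of what "a Prolog engine in your library" might look like, here is a tiny backtracking Horn-clause solver embedded in ordinary Python. It is purely illustrative -- the clause format and names are invented here, and a real embedding would also rename rule variables per use and do an occurs check.

```python
# Purely illustrative sketch of "a Prolog engine as a library call."
# Not a real system: no occurs check, and rule variables are not renamed
# per use ("standardized apart"), so recursive rules would need more care.

def walk(term, subst):
    """Follow variable bindings; variables are strings starting with '?'."""
    while isinstance(term, str) and term.startswith("?") and term in subst:
        term = subst[term]
    return term

def unify(a, b, subst):
    a, b = walk(a, subst), walk(b, subst)
    if a == b:
        return subst
    if isinstance(a, str) and a.startswith("?"):
        return {**subst, a: b}
    if isinstance(b, str) and b.startswith("?"):
        return {**subst, b: a}
    if isinstance(a, tuple) and isinstance(b, tuple) and len(a) == len(b):
        for x, y in zip(a, b):
            subst = unify(x, y, subst)
            if subst is None:
                return None
        return subst
    return None

def solve(goals, rules, subst=None):
    """Depth-first, backtracking search: yield substitutions proving all goals."""
    subst = {} if subst is None else subst
    if not goals:
        yield subst
        return
    first, rest = goals[0], goals[1:]
    for head, body in rules:
        s = unify(first, head, subst)
        if s is not None:
            yield from solve(list(body) + rest, rules, s)

# parent(tom, bob).  parent(bob, ann).
# grandparent(X, Z) :- parent(X, Y), parent(Y, Z).
rules = [
    (("parent", "tom", "bob"), ()),
    (("parent", "bob", "ann"), ()),
    (("grandparent", "?x", "?z"), (("parent", "?x", "?y"), ("parent", "?y", "?z"))),
]

for s in solve([("grandparent", "tom", "?who")], rules):
    print(walk("?who", s))    # -> ann
```

The point of the design is the call site: the rest of the program stays ordinary Python (or Java, or Lisp), and the backtracking engine is invoked only where that style of search actually fits.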
OK. Skipping back a little bit to the idea of Tech Square and AI becoming more scientific: besides the approach to problems and the mathematical rigor, can you think of any overlaps, maybe specific, maybe not, that we could incorporate from other fields?

What would we do with that overlap, exactly?

Let's see how best to formulate this. Ways we can apply concepts from a different scientific field.

That's happening all the time -- a lot of stuff from utility theory, from economics. A lot of people are doing that kind of work. It's not been my own particular forte or initiative, so I don't have any great insights on that one.

That's OK. Let's see. Besides your own work, of course, which is the coolest thing out there -- because otherwise, why would you be here?

No, no, no, no.

What do you think is the coolest thing going on right now?

If I were -- Bill said, "if I were young again"; I'm not as old as Neal, so I don't have to say that. But if I were starting fresh in a brand-new area, I guess I'd be most interested in the intersection of information technology, primarily AI, nanotechnology, and biology. It seems to me that if you look at where those are coming together, that's where, in some sense, science fiction is really going to happen. So I just think that's a very cool place to be. It would require a lot of tooling up on my part to get there, and I don't know if I have the flexibility, or the opportunity or interest, career-wise -- unless someone wanted to give me a year of support to get up to speed. But if I were starting out fresh, I would jump into the middle of that intersection.

Can you elaborate, or explain specifically?

No, it's not specific. It's more a general sense that real things are going to change -- the potential, for good or bad, is to change fundamentally how humans live in the world. I really believe that.

So looking ahead, how do you think it will affect us? I hate asking, because people don't like to say something concrete, but--

You mean about AI in general, or about--

Yeah, or specific -- perhaps [INAUDIBLE].

Yeah -- no, I mean, what gets me out of bed in the morning is that I really want to see an artificial intelligence before I leave the planet. And I think I will. Assuming I stay healthy and so on -- maybe another 50 years or something -- I absolutely believe I will see a time when we'll be arguing about whether an artificial intelligence is around.

So artificial intelligence in the form of, say, expert systems is incorporated into a lot--

Oh, sure. You can go to the Innovative Applications of AI conference; there's lots of great stuff. But I'm not talking about that. I'm talking about an artificial -- the way I talk about it to my students, and I've had students who had recently been in industry, mostly, is: basically a program, or an entity -- a physically embodied robot, or an embodied agent on screen, or something -- for which there's something there.

Sorry, I should have made that clear; I was switching gears. I know that these sorts of systems have been implemented, and they're behind the scenes in a lot of the things you've seen.
Are people, do you think, going to end up interacting with something they specifically identify as AI -- like, this computer I'm talking to?

Yeah, I think so. Just look at avatars and so on, the face. Yeah, absolutely. People want to; it's absolutely clear. Someone stood up in the earlier session this morning and talked about some huge amount of money, in euros, going into building companions for older people. I think it's absolutely going to happen, and sooner than we think.

I should get back to a session.

Thanks for talking to us today.

Yeah, thanks. And I'll talk to you later.