All right. So like you said, could you introduce yourself?

My name is Ron Brachman. I've been in the field for quite a long time. I got my PhD from Harvard in 1977. And I have worked in many interesting jobs since then and have pretty much been in AI for my entire technical career. And most recently, I am the immediate past president of AAAI, and I've also worked very extensively with IJCAI and SIGART, and a lot of the other organizations involved in making AI as a field successful and supporting researchers, teachers, and students in the field.

Cool. So what got you interested in artificial intelligence? How did you get started with all this?

I'm sure, like many people-- when I was a kid, I was very interested in science fiction. And I remember, my dad was an electrical engineer. And actually, now that I'm much older, I've found out some of the things he did. He actually worked for the Army and ended up supporting technical research in certain places-- sort of contract research, the way we know it now. And he was doing things like supporting work on pattern recognition and trying to replicate the kind of capability that a very good mechanic has, where he or she could listen to the engine of a car and end up with a pretty good idea of what's wrong, just by doing that kind of pattern recognition.

The other thing was, we had a lot of electronic stuff in the house. And I don't remember the year, but for one of the science fairs, probably in high school, with my dad, we took an article out of Popular Science or Popular Electronics and built a little robot that would follow-- you put some white tape on the floor, and it used a little light bulb, which would reflect off the floor, and then a sensor. And that would determine which way the robot's wheels would turn, based on whether or not it apparently had a lot of reflection. So it would turn one way until it hit the white tape, and then, when it detected a switch, it would turn and go the other way.
So it would follow a path all the way around the room. And I sort of got interested in seeing a little, very unintelligent robot doing what it might do. And then, with my interest in science fiction-- just enjoying watching TV and reading science fiction books pretty much my entire life-- it was a natural thing, when I got the opportunity, to at least consider intelligent computation and, I guess, what eventually became AI.

Now, I have to tell you. When I graduated from high school, I went to college. I went to Princeton. There was no computer science department. It was before the days when many institutions had computer science departments. So people like me would tend to major in electrical engineering, and you'd get a pretty heavy dose of analog circuitry and a little bit about digital computers. And it was back in the days when people would [INAUDIBLE] programmed machines. But there was an inkling on the horizon of AI-type stuff.

And I ended up going to graduate school at Harvard, which was certainly, at that time, not a hotbed of artificial intelligence, although now there's a lot of really good work going on there in the field. But I got very lucky in meeting up with the person who became my thesis advisor-- somebody named Bill Woods. He was a very well-known person in the field. He has made some major contributions to natural language processing. And Bill taught courses in natural language and information retrieval, and was kind of the window I had into AI at the very beginning of graduate school.

The great thing, though-- to continue here-- about going to graduate school at Harvard, at least at the time, was that we were allowed to freely cross-register at MIT. And this was-- I started graduate school in 1971. And this was a time when Marvin Minsky and Seymour Papert were very active in the AI Lab. And there was a huge number of really excellent graduate students and young faculty there, and it was a very, very exciting time for the field.
So this was the early '70s into the early '80s-- both getting my degree, and then afterwards. So I remember going over there and taking courses at MIT of a sort that weren't given at Harvard. So I got to take the Introduction to AI course from Marvin and Seymour, which was really outstanding, and I have some fond memories of that. I know I took a number of seminars. I took a course from Bill Martin. That had, sort of, early hints of knowledge representation, which eventually became the field that I got into. I can't remember whether I took a course from him or just met and spent time with Terry Winograd. This was right after his thesis was published, and there was a great deal of excitement. And then there were a host of other really top-notch students there, as I mentioned, who eventually went on to be very strong contributors to the field, like Drew McDermott, Bob Moore, and others, who are some of my contemporaries.

So in a way, I really lucked out. Because I got a great education at Harvard with a fantastic advisor, and yet I could take all the courses I could possibly drink in over at MIT, and get, especially from the masters, the real essence of AI as they were thinking of it at the time.

Very nice. So I guess, after you got your PhD, you stuck around for a while. Where did you go then?

One thing, again, that was very fortunate for me-- in Cambridge, there was a company called Bolt Beranek and Newman. Since then, they've typically just used the name BBN. But in any case, Bolt Beranek and Newman started in the '40s. And originally, they were an architectural acoustics firm, but they very quickly got into a significant amount of electronics and computation. And in fact, you may recall that they were a major, major player in the development and deployment of the ARPANET, and very famous people like Bob Kahn and others were there during the heyday of the ARPANET. They pre-dated me a bit, but it was also a very exciting place to get involved with.
It turns out that I had two different connections with BBN. One is that when Bill Woods was teaching these courses at Harvard, he was actually a lecturer and was employed full-time at BBN. He was the head of the AI department there. So I had a connection to the company through Bill. And then, secondly, I actually took a computer graphics course my first year as a graduate student, and ended up befriending one of the people-- in fact, several people-- who worked at BBN on some extraordinarily interesting graphics projects. So I was very lucky and got to work part-time at BBN, starting as early as '72. And I wasn't doing any AI work in this part-time job, but I was getting to meet people like Allan Collins, John Seely Brown, Richard Burton, Wally Feurzeig, Rusty Bobrow, and a host of other people who had very significant interaction with AI.

And so I was in that milieu, and Bill was my advisor. And so by the time I finished my degree, Bill was very interested in the work. And my thesis was about a fundamentally rethought view of knowledge representation, trying to understand how it could be very firmly based in mathematics, and not a lot of handwaving English words floating around on network diagrams. And some of the work that Bill started to do at BBN was based on this. And in fact, interestingly enough, there's a connection to Bob Kahn here-- Bob was the program manager, and subsequently an office director, at DARPA, and he was the sponsor of Bill's project at BBN.

Oh, OK.

And so I met Bob as a graduate student. And then, eventually, it was just a very natural thing to finish my degree and go to work as a full-time employee at BBN. And then I could change over from this part-time work on computer graphics to being a full-time professional AI employee, working on a somewhat elaborate implemented knowledge representation system that eventually came to be called KL-ONE, which we used in various applications for natural language processing and other types of reasoning.
And it was a really phenomenal time to be there. First of all, you could keep up your connections with MIT, because it was all in Cambridge. And secondly, there was this wonderful group of people at BBN-- Candy Sidner and Phil Cohen were there. Hector Levesque came and visited. Brian Smith, who was at MIT at the time, would be over there. Mitch Marcus, who was a very important natural language person, was there. So it was a very, very invigorating, very exciting time to be at BBN.

And so I stayed there, getting to work on my first love, which is basically the follow-on from my thesis and the implementation of this KL-ONE system, for about four years, full-time. And then I left the Boston area, after living there for 10 years, getting recruited to go out to the West Coast to work for a new AI lab that was grown under Schlumberger, which is a French oil services company, [INAUDIBLE] put at Fairchild Camera and Instrument, because Schlumberger had bought Fairchild. And that became known as FLAIR, the Fairchild Lab for AI Research.

And although we're moving away from Cambridge, I'll tell you a little bit about that. That was-- again, it was a very exciting time for AI. This was the mid-'80s. Some of the new AI companies-- Teknowledge and IntelliCorp-- were beginning. Some money was coming in, not only from the government, but from the commercial sector. And there was just a great buzz about the field. Expert systems were getting big, and new labs were springing up all over the place.

So I got recruited to go work with-- there were four prominent people from SRI who started this lab-- Peter Hart, who used to be the head of the AI Center, Marty Tenenbaum, Dick Duda, and Harry Barrow-- wonderful people who had a vision, picked up out of SRI, and started a brand new lab. And we had a blast there. I was there for another four years, working very closely with Hector Levesque, and also Phil Cohen.
We got to branch out to work with colleagues across the street at Xerox PARC. It was a very exciting time at Xerox PARC. Brian Smith ended up there. Richard Fikes and Danny Bobrow were there. And we ended up doing more knowledge representation work. And with Hector and Richard, I ended up building a system that was called Krypton. We called it that, by the way, because the atomic symbol for krypton is Kr, and knowledge representation is KR. And then we did a very tiny version of it, to see how fast we could make it run, and that system we called Kandor, because we mixed our metaphors. You may remember that Krypton is where Superman originated. And in the old Superman comic books, there was this city called Kandor that Brainiac shrunk and put in a bottle. So this was a little miniature version of Krypton, although we were mixing our metaphors.

Nevertheless, over that period, we did an awful lot of work on these semantically well-founded, frame-based knowledge representation systems that we were starting to use in applications. And I'm very happy to say that they ended up being relatively influential with the knowledge representation community.

So up until the middle of the '80s, I was in this very exciting milieu in California, in Palo Alto, with great stuff happening everywhere, including at Stanford. And then, after that set of events, I got recruited to come back to the East Coast, because Bell Labs was interested in building an AI group. And Bell Labs-- again, a phenomenal scientific research institution-- up until the early '80s, was expressly not interested in AI. In fact, there were many strong detractors working at Bell Labs who thought that AI was completely full of hype and was never going to achieve its vision, and that it was something to be avoided at all costs. And finally, as AI was making its mark in the commercial world in the '80s, they woke up and decided that it was really probably worth pursuing inside a large company like AT&T.
And in fact, there had been some other interesting work there. One early prominent expert system called ACE, for automated cable expertise-- dealing with outside plant cables for the phone company-- came into being. That was done by a guy named Greg Vesonder, whom I ended up working with for many, many years when I came to Bell Labs. We had a great collaboration. So that started the next phase of my career, which is a very, very substantial, long amount of time at AT&T-- Bell Labs and AT&T Labs-- which I can tell you about. But I might as well.

There, again, we were able to do some very significant new knowledge representation work. And at the same time, for me personally, this was an opportunity to start to get into management and leadership and grow a group. So this is the point in my career where I started to gently move from being purely a technical person, publishing and doing research, to a combination of that and managing, recruiting, and building large projects, which for me happened to be a lot of fun.

And we then built another knowledge representation system called CLASSIC, which, again, got a pretty wide amount of recognition. There were some really, really interesting things about it. And this was during an era that, I guess, you could say Hector and I started, around 1984, when we published an AAAI paper on an extremely interesting technical fact that we uncovered, which involved a very striking trade-off in knowledge representation and reasoning between the expressive capabilities of a language-- how many different kinds of things can you say in it-- and the computational complexity of computing with that language. And this is something you see in other parts of computer science, but it had not, at that point, been applied to knowledge representation. And we ended up showing how two very simple languages-- and they were really simple-- differed in one operator.
And in one case, you could prove that you could get a polynomial-time computation, and in the other-- and it just looked like a tiny little difference, one to the other-- you ended up in an NP-hard or NP-complete situation. And so what we called it was the computational cliff. Basically, you add this one little syntactic operator, which you couldn't really tell by looking made any difference, and off you went. And in fact, it radically changed the worst-case behavior of the system.

That actually led to an explosion of papers analyzing these simple, frame-based systems, adding all kinds of interesting operators and subtracting them, and looking at the computational complexity consequences. Literally, hundreds of papers subsequently followed on that, which was really quite interesting. And the kind of language that CLASSIC was-- which really very much followed this KL-ONE tradition-- at one point, it occurred to me that the essence of what we were trying to do is represent descriptions-- usually of objects in the world, sometimes actions. And the descriptions were structured, and they could be small or complex. And we ended up calling this class of languages description logics. And that, in some ways, grew a whole small niche area of AI, in which there's still an awful lot of ongoing work. And ultimately, that track has had a pretty substantial influence, for example, on the development of languages for the semantic web, which is very gratifying.

So to complete the story there for you, I spent more than 16 years at AT&T, ended up getting promoted to the level of a lab director-- so I ended up running a lab of 80 to 100 people at various times-- gave up my AI department to other people-- Henry Kautz, Michael Kearns, and really great people like that-- and grew the influence in the company, in terms of what we were doing-- the breadth-- as well as learned a lot about managing larger groups.

Well, yeah, sure.
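[Editor's note: to make the expressiveness/tractability trade-off described above concrete, here is a minimal, hypothetical sketch of its tractable side-- structural subsumption for a tiny description-logic-style language with only conjunction and value restrictions (ALL role.Concept), written in Python. The representation and names are invented for illustration; this is not KL-ONE, Krypton, or CLASSIC code. In the 1984 Brachman-Levesque analysis, a language of roughly this flavor admits a polynomial-time subsumption test like the one below, while adding one further construct (a role-restriction operator) makes worst-case subsumption intractable-- the "computational cliff."]

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Concept:
        atoms: frozenset        # primitive concept names, e.g. {"Person"}
        restrictions: tuple     # normalized (role, Concept) pairs, one per role

    def make(atoms, restrictions=()):
        # Normalize: merge value restrictions on the same role, since
        # (ALL r.C) AND (ALL r.D) is equivalent to ALL r.(C AND D).
        merged = {}
        for role, c in restrictions:
            if role in merged:
                prev = merged[role]
                merged[role] = make(prev.atoms | c.atoms,
                                    prev.restrictions + c.restrictions)
            else:
                merged[role] = c
        return Concept(frozenset(atoms),
                       tuple(sorted(merged.items(), key=lambda rc: rc[0])))

    def subsumes(general, specific):
        # Structural test: every atom and every value restriction that the
        # general concept demands must be matched, at least as strongly, by
        # the specific one.  Runs in polynomial time in the concept sizes.
        if not general.atoms <= specific.atoms:
            return False
        spec = dict(specific.restrictions)
        for role, gc in general.restrictions:
            if role not in spec or not subsumes(gc, spec[role]):
                return False
        return True

    # Example: "a person" subsumes "a person all of whose children are doctors".
    person = make({"Person"})
    parent_of_doctors = make({"Person"}, [("child", make({"Doctor"}))])
    assert subsumes(person, parent_of_doctors)
    assert not subsumes(parent_of_doctors, person)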
We ended up having just an absolutely extraordinary lab by the time I left AT&T in 2002. And that lab had a tremendous AI component to it. At the beginning, I have to admit, most of us were not particularly interested in, or very confident about, machine learning. We were largely focused on reasoning of various sorts-- information retrieval, web interfaces, and the like. But we were kind of cynical about learning. But we finally found a really excellent person-- in this case, William Cohen from Rutgers, who is now at Carnegie Mellon-- who was a terrific learning guy, who did both the practical and the theoretical side. And we hired him, and that started a cascade of hiring, where we ended up with-- I think some people would say-- probably the world's best machine learning group for quite a few years. And I can tick off the names of all the great people who were there. Many of them are extremely well-known.

Unfortunately, in 2001, AT&T decided to change its mind about the size of the budget it wanted to use for research in general. And to make a long story short, we ended up having to have a lot of people leave. And I left and took a retirement package, and ended up, in a sense, coming full circle, working at DARPA. And in fact, I got Bob Kahn's old job as the director of the Information Processing Technology Office. And in fact, I ended up having Bob be my boss, because I technically went to work for his company, the Corporation for National Research Initiatives, and then I did a rotation from his company into DARPA for several years as the director of the office. And that was just a huge thrill and an incredible opportunity, because the director of the agency at this point was extremely interested in how to get computers to learn and, overall, to have them be more intelligent, in support, in this case, of the Department of Defense.
So I got to help put in place a pretty wide variety of very significant programs and to help spend a lot of money on AI research, which was a great deal of fun-- great deal of fun. And so I finished my stint at DARPA, working for Bob, and ended up taking a really interesting job now at Yahoo, which is developing a new research lab. So that's my story.

So I guess we're getting ready to wrap up. I'm gonna roll way back to Tech Square again.

Please do.

I spend a lot of time in the original Tech Square, up on the eighth floor.

OK, you do? So I guess, what were some of the cool things going on that you specifically remember? What were you really interested in when you stopped by?

Well, it's going to be a little tough to remember them in detail at this point. But I do remember being very impressed. At this point, people were doing a lot of talk about the blocks world, using it for all kinds of AI challenge problems. And I think it was up on the ninth floor-- I finally got to see the robot arm that Winograd and others had used, where, if you remember, Terry's thesis allowed you to speak in fairly free-form natural language within the blocks world context. You know, find a big red block and move it next to a green pyramid, and things like that. And you could actually go up and see the robot arm taking things and moving them around. And to be honest, even though robots were in the literature and on TV and movies at that point, to see one at the MIT AI Lab-- to see the AI work in that universe, with people like Marvin, Seymour, Winograd, Carl Hewitt, and Gerry Sussman all floating around-- that was very exciting.

Another thing that was really quite interesting was, I met quite a few other people, and there were lots of fun discussions in the center hall of the eighth floor there, and seminars, and lots of arguments, and a lot of Chomsky bashing at the time.
People were discovering agents, and a lot of really exciting work was going on in natural language processing at the time, and in what was called procedural semantics. And I think it's hard to remember too many specific anecdotes, but I remember the people, [INAUDIBLE] the classes and lectures.

I do remember one thing quite vividly from the Intro to AI course. It was given in a large auditorium. Seymour got up and did a balancing act while juggling, on the kind of cylinder where you put a board--

[INTERPOSING VOICES]

Yeah. I can't remember why he did that. But I certainly remember that quite vividly. And I also have to say, I remember the style of lectures that Marvin would give.

One other quick anecdote-- another very important event in my AI career was in 1977, which was exactly when I finished my thesis and graduated. The IJCAI conference, the large international AI conference, was held in Cambridge. The big lectures were held in Kresge Auditorium, and the talks were given in the classrooms. So it was very exciting to be living there and be there as the whole world came for the AI conference. And I remember Marvin giving an invited lecture. And basically, this was well before PowerPoint, right? He got up there with a transparency and started doodling, and basically gave his talk doodling on the transparency, and ended up writing right on the projector and didn't even realize it.

He still does that.

He still does that. Right. But it was great fun-- very, very memorable. I also remember at that point, Doug [INAUDIBLE], who was very well-known in AI, gave the Computers and Thought lecture. And that was a great piece of showmanship as well. So that was very exciting to be there, having finished my thesis exactly at that time. It was a major, major conference and a big happening in the States. And it was a great place to be at that particular point in time.
[INAUDIBLE]

Yeah. Thanks for talking. I wish we could talk some more.

My pleasure.

And just so you know-- I don't know if you realize, but we have videotapes of the AOF arm moving the blocks and [INAUDIBLE] system. That's part of what we're developing--

Yeah, I would hope.