Do you mind introducing yourself?
I'm Jon Doyle. What sort of introduction?
So what are you doing right now?
I'm pursuing how to represent decision [INAUDIBLE] preference information, which economists typically have not had good representations for, but which we'll need if we want proxies or intelligent agents that do what you want. And I am also working on mechanics, and I just published a book that extends the modern axioms of mechanics to cover minds as well as bodies. If you think back to the time of Descartes, there was both the physical body and the mental substance. You now can have a theory in which both parts of this satisfy the same set of mechanical laws. So that's the other part of my--
Well, we'll get back to that. How did you first get into artificial intelligence?
Well, the proximate reason was that I had a teacher at the University of Houston who thought that understanding the mind was going to be the next big thing in the next 20 years. And that got me very interested in it. I went into the field.
Cool. So what other kind of research was-- when was that?
That was '72 or '73, somewhere around there.
So where did you go from there?
Well, I went to grad school at MIT, and conceivably, I'm the first person to actually get a degree with that specification. My PhD is in artificial intelligence.
Wow. Congratulations.
They offered it. I don't know-- maybe somebody must have had a thesis in that before, but I picked it, and I worked with Jerry Sussman, who was my advisor. And after that, I worked at Stanford University for a year with John McCarthy, and I was at CMU for seven or so years.
All the big places.
And I was back at MIT for another 13 or so. And now I'm at North Carolina State University.
Great. Great. That's cool.
So tell me-- since our project originally started with some film reels we found at Tech Square, could you tell us a little bit about what was going on when you got there? And maybe what you remember about Sussman?
Well, I know various stories he told me, but I saw the thing on the list. The two best stories I know, that I can remember immediately from my grad school days, both actually involved Guy Steele, who was a student there at the same time as myself. In one of them-- I forget whether it was-- I guess it was while he was working on his master's thesis, maybe his PhD. But he had a bug which depended upon the phase of the moon, literally, which was a joke up until that point. But [INAUDIBLE] at that point had a thing where you could print out the time and the date. In fact, you could also print the phase of the moon, and it turned out that his program would compile things into LISP, and it would have these comments up at the top saying when it was done-- there was a sentence about when it was done-- and at the end there would be a period at the end of the sentence. Well, it turns out there was word wrapping in the output routines, and if the phase of the moon was just right, that period would wind up isolated on a [INAUDIBLE] bug. When this happened to him and he realized what it was, he took a screenshot, and for years he had that screenshot on his office door.
The other thing is that I had taken quantum mechanics before I came to MIT, and Guy, I think, was taking a quantum mechanics class. And there was this funny article-- maybe not so funny article-- in the paper, or a joke, at the time about an aircraft windshield manufacturer who had been testing windshields. They would shoot chickens at them at 800 miles an hour, and the joke was that somebody finally realized they didn't need to use live chickens to do it.
But Guy decided to exercise his skills, and he computed the wavelength of an 800-mile-an-hour chicken and then put up on his wall a chart of frequencies: various parts of the electromagnetic spectrum, then X-rays and gamma rays, and then a big separation mark, and then the frequency of an 800-mile-an-hour chicken.
That's fabulous.
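As a rough check of the joke: assuming an ordinary chicken of about 1.5 kg and the simple de Broglie relation (both assumptions made here for illustration, not Steele's actual numbers), 800 miles an hour is about 358 m/s, and the matter wavelength comes out to roughly

```latex
\lambda = \frac{h}{m v}
        = \frac{6.63 \times 10^{-34}\ \mathrm{J\,s}}{(1.5\ \mathrm{kg})(358\ \mathrm{m/s})}
        \approx 1.2 \times 10^{-36}\ \mathrm{m},
```

about 24 orders of magnitude shorter than a typical gamma-ray wavelength, which is why the chicken needed its own separation mark on the chart.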
Was that representative of the general culture at MIT--
People thought about all sorts of stuff and talked about all sorts of stuff. There was a gang of us, of varying composition, that would walk over to a sub shop near the labs, and there would always be some discussion. During my time, there were a lot of discussions of philosophy and logic and all sorts of things, and those were some of the most interesting discussions of the day.
That's cool. Were you part of the Chinese Food Sect? I used to hear that was a very popular option.
Yeah, [INAUDIBLE] small eating place was down the street, and they had very good food. They had a hot and sour wonton soup which was extremely spicy, and if you had a cold or something, it was always a good idea to go there, and your entire respiratory system would be cleaned out in short order by eating some hot and sour wonton soup.
I'm wondering how similar it was to now. What were people's hours like at the lab?
A lot of people worked late. I've always been a morning person, so a lot of the time I would get there at 4:00 or 5:00 in the morning, and some people would just be leaving at that point, and I would have a lot of time. And well, this was also-- I mean, I probably wouldn't have gotten there quite that early, except this was the time of shared mainframes.
Oh, absolutely.
And also, not every-- there was a terminal in each office or so, but not everybody--
So if you're much faster--
So if you wanted time to spend on a computer, either you had to stay later than everybody else, or you had to get there earlier than everybody else.
Interesting.
I got there earlier.
So what do you think some of the most challenging problems in artificial intelligence were, and remain today in the field? Or what intrigues you? Maybe that's too general a question.
Well, there's still-- we still don't know how to represent lots of information, and part of the reason is that everything that we know how to do, we basically have to filter through individual human minds. Somebody has to understand how to do that. And intelligence is so complicated, and the mind is so complicated, that we can only understand little bits at a time, and there's a finite number of people in the field. And it's not clear that you can get something of the complexity and size that would look like a regular person without spending a very long time on it. If I look at mathematics or some other fields and see how long it took to develop what are thought of as basic everyday tools today, it was hundreds of years. If you look at physics-- I mean, Newton's famous for these three postulates. F = ma was not one of those. F = ma is an equation published by Euler 50 years later. There was a long period. Now, there weren't very many people working on this compared to today, but there was a long period to get from initial ideas of how something would work to the sort of things you could use every day to calculate more things or to do stuff. And I think we're in that same sort of position in AI, that we have all sorts of good ideas, and it takes a long time to set the table, if you want to think of it that way, for all the tools you need, and to collect all the information and try it out.
And since it's a long time, invariably, somewhere you're part of the way through, and you have a better idea about some part, and so you don't quite finish what you were doing, and you try to keep adjusting. And the time keeps moving off a little bit. I mean, there's a lot of progress being made, but it's a very difficult plotline.
So if there were a subarea or paradigm of artificial intelligence in the future that was developed and had a law named after you, what area would you like that to be?
Well, aw, no. I don't know anything about that.
What area do you find most fascinating within artificial intelligence?
I don't know how-- well, I--
What do you think the biggest potentials are?
Well, one of my favorite activities is trying to formulate the ideas that we don't understand well yet in mathematical terms, not because the formalization always succeeds, but because just making that attempt raises a lot of problems that you hadn't quite put your finger on before. And I think one of the exciting things about the field is that we have figured out how to formalize various things that nobody had figured out how to say precisely before, and I think that's an exciting thing for me. It's not to everybody's taste, but that's exciting to me, and that's going to continue for a long while. I think if we're going to make really big progress, there has to be a lot of additional work on formalizing that.
Can you give us some examples of things that have been formalized already?
Well, the one that I've been involved with is non-monotonic logics, which is a real contribution to logic, as far as I'm concerned. And it's the first theory anybody had about how to talk about assumptions that you might want to make, and rules for making assumptions when you want to have them fit well or cohere with other information that you have. It's not probabilistic. There are probabilistic theories of it, but in fact, I think those are not the right way of thinking about it. It's a theory that combines preferences about what you think you'd rather conclude in a state of ignorance with some sort of inference about, well, what do I know, and how does that constrain what I can do to fulfill my preferences? So it's a quite interesting logic. It's the basis for a huge literature right now on logic programming. It still has not worked its way through most AI representations or mechanisms, partly because of some complexity issues that we're still working out, but it's a new idea, as far as newly formalized ideas go, and it didn't exist before.
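A minimal sketch of the behavior being described, assuming nothing beyond ordinary Python: a default conclusion is adopted only while it is consistent with what is currently known, so adding a fact can retract it, which is what makes the inference non-monotonic. The string encoding and single pass over the rules are illustrative only; this is not Doyle's truth maintenance system or any particular non-monotonic logic.

```python
# Toy default reasoning: adopt a default conclusion only if its negation
# is not already known. Adding a fact can remove a previously drawn
# conclusion, which is the non-monotonic part.

def conclusions(facts, defaults):
    """facts: set of literals known to hold (strings).
    defaults: list of (prerequisite, conclusion) pairs, read as
    "if prerequisite holds and 'not <conclusion>' is not known,
    conclude conclusion"."""
    derived = set(facts)
    for prereq, concl in defaults:
        if prereq in derived and ("not " + concl) not in derived:
            derived.add(concl)
    return derived

defaults = [("bird(tweety)", "flies(tweety)")]  # birds fly, by default

print(conclusions({"bird(tweety)"}, defaults))
# includes 'flies(tweety)': assumed in the absence of contrary information

print(conclusions({"bird(tweety)", "not flies(tweety)"}, defaults))
# the extra fact blocks the default, so 'flies(tweety)' is no longer drawn
```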
I think we're going to have more of those, but we don't really have them yet. The work I'm doing on preferences right now is just trying to say, well, if you talk about generic preference information, what does that mean? If I say I like red cars better than blue cars, to economists, that doesn't necessarily make any sense, because they want to talk about decision preferences that order the specific alternatives you have, not properties of those. And there are a lot of possible interpretations for what it could mean to say "I like red cars better than blue cars," and we are still just exploring the first couple of the simplest of those. And we have no idea, when you or I talk about what we like or dislike, what's involved in the meaning of it.
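As a small illustration of that ambiguity, here is a sketch, in Python, of two of the many possible readings of "I like red cars better than blue cars" applied to a toy set of alternatives. The car attributes and the two readings are invented here for illustration; they are not Doyle's formal proposal.

```python
# Two possible readings of the generic statement "I like red cars
# better than blue cars". Which pairs of cars the statement actually
# orders differs between the readings.

cars = {
    "red_sedan":   {"color": "red",  "mpg": 25},
    "blue_sedan":  {"color": "blue", "mpg": 25},
    "blue_hybrid": {"color": "blue", "mpg": 50},
}

def preferred_ceteris_paribus(a, b):
    """Reading 1: a red car is preferred to a blue car only when the
    two cars agree on everything other than color."""
    same_otherwise = all(a[k] == b[k] for k in a if k != "color")
    return a["color"] == "red" and b["color"] == "blue" and same_otherwise

def preferred_color_dominant(a, b):
    """Reading 2: color outweighs everything else, so any red car is
    preferred to any blue car."""
    return a["color"] == "red" and b["color"] == "blue"

print(preferred_ceteris_paribus(cars["red_sedan"], cars["blue_sedan"]))   # True
print(preferred_ceteris_paribus(cars["red_sedan"], cars["blue_hybrid"]))  # False: mpg differs
print(preferred_color_dominant(cars["red_sedan"], cars["blue_hybrid"]))   # True
```

Under the first reading, the statement says nothing about whether the red sedan beats the blue hybrid; under the second it does, which is the kind of gap between interpretations described above.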
So have you been-- how do you feel about the development of the field? What have been some of the biggest disappointments, surprises, things you're particularly pleased about?
The most surprising thing to me, I guess, is that there's been an enormous amount of progress, both in the computers that underlie what we can do and in what people have actually done with it. What's really surprising about that isn't that all this progress has happened; it's that almost all of it is invisible to most of the world, because right now, huge sections of society are dependent on various sorts of expert systems or knowledge-based systems, or systems that were synthesized by knowledge-based systems, that control all sorts of interesting things, and these are right out of AI.
They were so used to the--
Well, most of them are hidden. So in fact, if you look at AI in the popular world, you see novels and whatnot and that-- I should mention that some of those were things that got me interested, or at least set the stage for me. But you see commercials-- or ads-- where people are claiming to have AI in their toaster or whatever, or in their rice cooker.
Why are you looking at me?
It's not untrue, but it's not-- it makes people think that, well, that's what AI is. I mean, from that point of view, my rice cooker knowing to turn itself off is intelligence, and that's not really a very good notion. What we're really interested in is people who know a lot about something, or people who know something about everything. We're in the process of formalizing the entire world's knowledge, and this has not been done, not even for the expert knowledge that you find in any university curriculum. It's going to ripple through all the disciplines, and eventually we'll be able to make use of that. But this is a long process, because trying to figure out how to say things formally in a reasonable way is something that-- we know how to get started on it. We have no idea whether the tools we have right now are capable of completing the job.
Building on that, what kind of collaboration do you see with other fields in the near future? Which fields are they?
Oh, well, you can pick any field and just try to say, how do we express the knowledge in that field in formal terms that can be used by a machine to reason, or set up problems in that field, or do analysis, and so forth. Right now, this is all manual activity. It's some person sitting down and worrying about it. In some cases, you can have some data sets from which you can extract some bits of the information, but generally not. I mean, you can't extract from data sets the knowledge involved. If I'm a chemical engineer, how do I look at this power plant, or chemical plant, that somebody wants me to come consult on, and figure out which parts are relevant to the task that they're setting me-- keeping it from blowing up, or getting the efficiency higher on something? I have to know what data to look at, how to think about the problem, how to decompose it, and this is not something that you can get out of a data set. And the only way you can get it right now is by reading textbooks and talking to experts and doing a lot of manual work, and it'd be nice if we could do that quicker. But that's something which has not been done for virtually any area of human activity, and it's not just the sciences and engineering. It's how to write well, how to speak well, how to learn-- to do self-help, how to manage your own life.
So what do you think are the most important skills a new researcher in the field should have?
One of them is to always keep trying to understand something. Don't ever think that just because you've read a paper or two, you can then sit back and you know what you need to know. Always be working on a problem, because if you have a problem, that gives you something to think about. And if you are just thinking of this as a task of understanding the literature, you're not going to be as motivated. I think the more mathematics that you know, the better.
And by that, I don't mean just calculus or something like that; there's a lot of advanced mathematics that is about the structural relations of things. And I think all that's helpful in different ways to different people. But I'm also trained-- I have mathematical training, so that may be just a prejudice, but I think it helps a lot of people in a lot of ways.
And so you mentioned before that there were some science fiction authors that had inspired you at the beginning to get into artificial intelligence, or to think about artificial intelligence. Who were they?
The main ones were the Asimov books on robots. Maybe even more so, Heinlein's The Moon Is a Harsh Mistress was one of my favorite-- still is one of my favorite-- books. Those are the main ones I can think of. But I mean, I've always been interested in thinking, and I was lucky that, when I was in college, thinking was a topic that looked like you could actually go and study.
Cool. Well, thank you so much.
Oh, you're welcome.