Professor at MIT.

That's correct.

So, to start with the starting question: what got you into the field of artificial intelligence?

I've been interested in how I think since I was a small kid. And for many years I was searching for the field that would provide some answers to the questions I was asking. So I tried neurobiology. I tried information theory. And then one day I attended a lecture by Marvin Minsky, and that was the end of the story. I knew where I was.

I guess, what was not there in the other disciplines you investigated?

A computational understanding of the nature of intelligence. Not just a catalog of behavior, but an explanation of it. Something that reaches all the way down from the highest levels of reflective thought to the kind of thinking that's closely aligned with, and touching, our perceptions.

OK, that makes sense. So you took a course with Marvin Minsky? Or did you just attend a lecture? What course?

Oh, I've forgotten the name, but it was the graduate artificial intelligence course, which I subsequently translated into an undergraduate version and have been teaching ever since.

Cool, so I guess you ended up working in the AI Lab?

Yes. On the basis of the term paper I wrote for that course Marvin was teaching, he invited me to spend the summer in the Artificial Intelligence Laboratory, and I was of course thrilled beyond description and never left.

What sort of things were going on when you first arrived at the AI Lab?

Building robots, making computers see, building descriptions of things, trying to understand how computers can learn in ways that were different from the way rats learn from repeated trials.
All that sort of thing was tremendously exciting, and especially so because the kind of hardware that people like Tom Knight and Richard Greenblatt were building and programming was unique in the world, and we felt that nobody else had anything like the kinds of tools we had to work on the problems.

Lisp machines?

Even before Lisp machines, yeah. Lisp machines came along later. A chess machine was there. But even at the beginning, those guys knew how to put things together. We had bitmap displays before anybody had heard of bitmap displays, Ethernet-like connectivity, and email systems, all of which were pioneered and championed in the early days of the Artificial Intelligence Lab.

OK, sweet.

I suppose I should add, especially, the time-sharing system. It was a fabulous time-sharing system that made it possible for a lot of people to do good work on a computer that, by today's standards, of course, would probably be about the sort of thing you would think to put in a watch.

Fair enough. So I guess you-- what was your first project when you went to the AI Lab?

Oh, I did some things on computer vision. I did an analysis of how a computer could recognize that an object had a hole in it. But pretty quickly I was on to my PhD thesis topic, which was on the subject of how a computer can learn.

Right. I suppose you probably haven't seen our project page, but the first film in the collection we have is Professor Minsky talking about his thesis with the [INAUDIBLE]. It's often the case that someone else can explain it better than you can.

Yeah. Whenever we explain anything, we're hallucinating that the other person already knows what we're about to tell them, so we're not necessarily the best people to get the explanation from.

You're right about that. So I guess, where did your work go from there?
Well, it's always been-- to some extent accidentally, I suppose-- always been focused on problems having to do with learning.

And to back up a little bit and add something to that earlier discussion-- you might want to add it in-- I think that the main characteristic of those early days in the Artificial Intelligence Laboratory was a tremendous excitement that derived partly from the monster brains who were working there, and partly from the general mission. We really felt that we were going to be able to develop a computational theory of intelligence, and do some very deep scientific work that no one would ever be able to do again, because once you discovered it, you discovered it.

Makes sense. I guess we've heard a lot about the atmosphere of the AI Lab in the early days of its existence. Do you have any particular anecdotes or cool stories you remember from when you were there?

No.

No? Just general work?

All of them are-- all of them are good.

Oh, OK. Fair enough. So after the arch thesis, you still work on machine learning, more or less?

Still do, yes. Of course, the field has changed a great deal since those early days. That's partly a natural consequence of a field growing up and maturing, with more people working in it and standards changing. And to some extent, I think the field has suffered a bit from the disappointments of the early days. The problems proved to be much harder than anyone anticipated. Early progress was so rapid that there was a general sense that we'd be able to solve all the problems within a decade or two, but we weren't able to do that. And as a consequence, I think there was a kind of general and glacial drift away from addressing scientific questions and toward developing leading-edge applications in computer science.
And that glacial drift over a long period of time, of course, has a big impact on who goes into the field and what kinds of things they do. I'm pleased to say, though, that I think these days there is emerging a new sense of enthusiasm, or a new sense of expectation, a new belief that we can grapple with the scientific questions and develop what we all originally set out to develop, which is a computational theory of human-level intelligence and beyond.

So what do you think of-- I mean, at the AAAI conference there was a fair amount of complaining that everything is statistical these days.

There are all sorts of fads that have appeared and disappeared in artificial intelligence over the past 50 years, and that will continue to happen, I think, until we have a much better sense of what the answers really look like. Certainly those statistical approaches have merit. They're especially valuable in applications. They shed light on some scientific questions. But like all the other popular approaches, they're a tool, not the answer.

So OK, makes sense.

The real answer lies in the observation that intelligence is the product of thinking with our perceptions, and thinking with our descriptions, and a blurry line that separates the two. But, to tie a couple of loops together, that thinking is backed in all cases, up and down the line, by an incredible capability for accumulating, analyzing, and making use of regularities in what we experience, and not all of that is going to be susceptible to attack by any one particular method.

So Professor McCarthy said that, in his opinion, human-level intelligence is possible within the hardware of the machines we had in the '70s. He said it's not hardware dependent at all; all the problems remaining to be solved are software breakthroughs. What do you think?

I think that we will need some new hardware, but we don't know what it will look like.
But we do know that whatever it will look like, Tom Knight can build it.

Fair enough.

I think our thinking about the problem has not been severely handicapped by the hardware of the day, and I pick my words carefully, because in the early days I used to do a little computer vision research. And in those early days, it was an expensive operation to drive a 3 by 3 filter across a television image. And so you didn't think about driving a 30 by 30 filter across a much larger image. That's easy with the kind of hardware we have today, but we didn't think about it then, because it was so far beyond what we could anticipate having within our graduate students' lifetime-- that is to say, within the time a student will spend [INAUDIBLE]. So it did influence our thinking a bit. But aside from that small footnote, I think that our work toward integrated artificial intelligence has never been severely handicapped by computational limitations, because we know that if we need massive amounts of parallel look-up, we can get it. After all, one of the great reasons for optimism these days is that computing has become free, as Tom Knight likes to say. In the computer I have at my side, I have a gigabyte of memory. And when I came into the field of artificial intelligence, memory was a dollar a byte. So there's a billion dollars, unadjusted for inflation, over there in my laptop. And years from now, when this videotape is seen, having only a gigabyte of memory in a laptop will seem quaint, just as thinking of memory as costing $1 a byte seems quaint today. So computing in massive quantities is no longer a cost obstacle, and that's one of the great reasons why we can be optimistic about the future. It no longer requires a big laboratory, a giant contract, and vast teams of people to do good work in the field.
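To make the cost point above concrete, here is a minimal sketch, not from the interview, of what "driving a k by k filter across an image" amounts to; the NumPy implementation, the image size, and the filter sizes are all illustrative assumptions.

```python
# Illustrative sketch only: sliding a k-by-k filter across an image, the
# operation described above. Image size, filter sizes, and the use of NumPy
# are assumptions made for this example.
import numpy as np

def apply_filter(image: np.ndarray, kernel: np.ndarray) -> np.ndarray:
    """Slide the kernel over every position where it fits and sum the products."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for y in range(out.shape[0]):
        for x in range(out.shape[1]):
            # Each output pixel costs kh * kw multiply-adds, so total work grows
            # with image area times filter area: a 30x30 filter is roughly 100x
            # the work of a 3x3 filter over the same image.
            out[y, x] = np.sum(image[y:y + kh, x:x + kw] * kernel)
    return out

frame = np.random.rand(480, 640)                          # roughly television-sized
cheap = apply_filter(frame, np.full((3, 3), 1 / 9))       # feasible even on early hardware
heavy = apply_filter(frame, np.full((30, 30), 1 / 900))   # unthinkable then, routine today
```

The explicit nested loops are there only to make the cost visible; in practice one would call an optimized convolution routine, but the scaling argument is the same.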
People tend to hate questions that are specific, but I guess, where do you think we're headed, and when do you think we're going to get there?

I'm very optimistic. I think if you look at where the field of artificial intelligence is today, you can say it's at the point where molecular biology was in about 1950. So we're about half a century behind. But what I mean by that is that we've been in business now for 50 years. We've done some fabulous things in the world of applications, just as molecular biology had, or as biology had, in 1950. They had antibiotics, and we have various kinds of applications that populate all sorts of systems today. But we're still waiting for the analog of the discovery of the structure of DNA. And I think we have accumulated enough knowledge in our field and in the [INAUDIBLE] fields that that kind of breakthrough can now be expected. Whether it will come tomorrow or a decade from now, I can't say for sure, but I have every confidence that we are ready for that kind of a great discovery, after which the world will never be able to look back.

What do you see human-level intelligence doing for us, as either individuals or as a society?

We don't want to get hysterical about those possibilities. That kind of hysteria comes and goes in the field of artificial intelligence, and what we generally find is that the commercial value of new technology, especially artificial intelligence technology, is not in displacing people, but in making it possible to do something you couldn't do before. Companies that go into business to replace people using artificial intelligence generally go broke within a short period of time, because people are pretty hard to replace. On the other hand, if you can combine people and computers, and put them together in combinations that can do something that couldn't be done before, then you've got a serious business. And that's the way I think artificial intelligence will be making itself felt in the years to come-- many years to come. And by the time computers are really as smart as people, we will have had many generations to adjust.

Fair enough.
Do you read science fiction?

No, I make it.

Very good answer. What do you read?

What do I do?

Do you read?

I like to read history.

History?

Yeah.

What kind of history?

American Civil War history in particular, but just about any history, from histories focused on Roman emperors to more modern histories of Peter the Great. Anything that's similar to that.

Fair enough. Just curious. Sometimes people have very crazy tastes. Oh, artificial intelligences-- that's all they read about.

A lot of people in artificial intelligence are addicted to science fiction.

Fair enough, OK. Well, I think that should probably do it. OK, let me think if there's anything else I want to touch on.

Can I just say a few things you might want to edit in if you feel like it? Everyone is always curious about how long it's going to take, and J.C.R. Licklider always said that we tend to overestimate what we can do in a year and underestimate what we can do in a decade. And I suppose if we take this problem of artificial intelligence, we can usefully paraphrase what Licklider said and argue that we tend to overestimate what we can do in a decade and underestimate what we can do in a century. We ought to have a great deal of optimism now because of all the knowledge that has been accumulated. We're a little frustrated as a field because we haven't solved a big problem yet. On the other hand, I think we have a much clearer picture of the way forward. I think many of us believe that to understand the nature of human intelligence is going to require a great deal of capability for understanding intelligence at every level: from perceptions, to descriptions, to the interactions of those, to how we hallucinate images to help us answer questions-- how, if I were to say to you, "John loves Mary,"
did John kiss Mary? If I were to say to you, "John kissed Mary," did John touch Mary? You would, I think, imagine the scene and read the answer off of it. So we're beginning to understand now that intelligence is not the product of a single faculty; it's the product of many faculties in our brains all working together. We think with our eyes. We think with our hands. We think with language. We think with all parts of our brains working together, and working in the light of all the experience that they accumulate. So when we get the answers, I think what we'll find is that there's fabulous engineering in the brain, and that that engineering rests on a foundation of fabulous scientific principle. When we understand the whole story, we'll see the very sophisticated mechanisms that are involved. And how long will it take? I have a great deal of confidence now, because I think we have such a much clearer idea of what needs to be done than we used to, that we can expect in my lifetime to substantially solve the key problems.

Excellent. OK. Well, that's encouraging to hear.