So thank you for coming. Why don't we start out by-- why don't you tell me a little bit about yourself, what your background is, and what you're doing now?

What I'm doing now? Well, I've been at Yale University since I graduated from MIT in 1976. I've been teaching computer science ever since and doing AI research, first in areas having to do with logic. So I did a lot in the foundations of nonmonotonic logic in the '80s, things like truth maintenance systems. And then starting in the '90s, I began to get back to the-- well, I guess I did this in the '80s as well-- started to get back to work on automated planning, which was-- that was the topic of my thesis. And like many people, I was sick of the topic when I finished my thesis, so it took me a while to get back to it. But I've been doing a lot of work in that area since.

And I began to get disillusioned with logic in the late '80s and decided that robotics would be the foundation of AI, and I still think that's probably true. So I made a shift into that area for about 10 years. But around 2000, I stopped work in robotics, partly because of health problems, partly because the infrastructure requires too much effort to keep going, and it's much easier to do purely software work. So I've been working on things like knowledge representation for the semantic web for the last few years, in addition to the automated planning work.

Cool, very cool. And what were you doing at MIT, originally?

You mean how did I get there?

Yeah, what attracted you to the institution, what did you study?

I was there as an undergraduate, and I was interested in artificial intelligence long before I got to MIT. Ever since I was a little kid, I was interested in artificial intelligence. I don't know why, exactly, but I've just been fascinated by the idea that everything about the human mind can be duplicated in a machine. And so I pretty much knew that's what I was going to do in graduate school.
And I started hanging around the AI Lab when I was a senior, and I went into a program where I got a simultaneous master's and bachelor's degree after five or six years, or something like that.

Great, yeah. I think it still exists.

And I was interested-- so I got there and was hanging around with Sussman.

This was around '73?

No, 1970, really, I think. And I was hanging around with people like Sussman and Terry Winograd, and they were working on Micro-Planner, which was a small version of Carl Hewitt's Planner language. And for some reason, people thought this would be an important vehicle for AI work, and it was loosely derivative from predicate calculus. It was really Prolog-- really an early Prolog, if you think of it like that.

So we hacked on that, and Sussman was finishing up his thesis.

What was he like?

Gerry?

Mhm.

Exactly the way he is now. He was very energetic. And he became my advisor for my PhD thesis. And I decided to do it in the area of electronic circuit design. I don't know why I did this. I was really interested in planning and in the organization of behavior by the computer. And on top of that, I was interested in meta-level reasoning. So I thought that the computer would attack a problem, and it would-- think of it as something where it had to make a plan, so it would break it down into substeps. And if it got into trouble, it would ascend to a higher level and reason about the problem it was tackling using meta rules.

Randy Davis from Stanford had finished his dissertation, I think, around 1974, or something. And he had made heavy use of meta rules for the MYCIN system.

OK, yeah.

So we were all really--

That was for medical information?

--intoxicated with this kind of meta stuff.
Yes, the MYCIN system was an expert system for prescribing antibiotics. So it decided what antibiotic to prescribe for a given patient. And he added meta rules to the system.

And when you say meta rules, what do you mean by that?

Rules for reasoning about an application's rules.

For example?

I can't think of an example, offhand. But that's the problem. I really decided that the whole thing had been misguided, that what I really got out of my experience working on my dissertation was that what's really important is an algorithm. That is, the problem with the early work in AI in that era was, we really all-- we were all-- we had GPS envy. We really wanted to invent GPS, the general problem solver.

OK, yes.

Which had already been invented, but we thought, well, we could do better. We could do better than that. It reflected Newell and Simon's assumptions about problem spaces and things like this that we didn't necessarily think were the right approach. But that's kind of what we wanted to do. So there was this idea that meta reasoning-- that somehow this would magically make things general.

[INAUDIBLE]

What? Oh, it's OK.

But in other words, here I had a circuit design system. That was what it was supposed to do. But I was really focusing on all of the machinery: the plan library, and the meta reasoning, and the deductive system, and the unification algorithm, and all this stuff. I really didn't focus at all on the problem of electronic circuit design. And so that didn't really work very well. Some of the ideas that came out of it were later quite influential. I think the phrase was "hierarchical task network"-- I didn't use "hierarchical," but I believe I called it a task network.
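Since no example of a meta rule comes up in the conversation itself, here is a minimal, hypothetical sketch in Python (not from the interview) of the general idea being described: object-level rules draw conclusions about the domain, while a meta rule reasons about those rules, deciding which ones to try and in what order. All rule names, conditions, and conclusions below are invented for illustration.

```python
# Hypothetical illustration of a meta rule: a rule whose subject matter is the
# application's own rules, not the domain.  Everything here is invented.

# Object-level rules: evidence about a patient -> a conclusion.
OBJECT_RULES = [
    {"name": "R1", "requires": {"gram_negative", "blood_culture"},
     "concludes": "consider E. coli"},
    {"name": "R2", "requires": {"gram_positive"},
     "concludes": "consider Staphylococcus"},
    {"name": "R3", "requires": {"recent_surgery"},
     "concludes": "consider a hospital-acquired organism"},
]

def meta_rule(rules, evidence):
    """Meta rule: discard object-level rules whose preconditions can't be met
    by the evidence gathered so far, and if the patient had recent surgery,
    try rules about hospital-acquired organisms before the others."""
    applicable = [r for r in rules if r["requires"] <= evidence]
    if "recent_surgery" in evidence:
        applicable.sort(key=lambda r: "hospital" not in r["concludes"])
    return applicable

def run(evidence):
    # The meta level sets the agenda; the object level draws the conclusions.
    for rule in meta_rule(OBJECT_RULES, evidence):
        print(rule["name"], "->", rule["concludes"])

run({"recent_surgery", "gram_negative", "blood_culture"})
# R3 -> consider a hospital-acquired organism
# R1 -> consider E. coli
```

In MYCIN, Davis's meta rules played roughly this role: they pruned and ordered the object-level rules the system would consider for a given case, rather than stating medical facts themselves.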
I mean, it was probably the-- or one of the first programs that ever had a circuit diagram, or a circuit design system, on a computer, right?

I really don't know.

OK. I mean, circuit designers everywhere use those now.

Yes. Yes, but as I said, circuit design wasn't really what it was about. So it was really about HTN planning and organizing planning information, planning knowledge. But you couldn't point to an algorithm and say, this algorithm does the following. That would have been not general enough. Generality was what we were obsessed with.

I think it's sort of interesting when you say how you didn't want to just-- you all had GPS envy a little bit, because it speaks a little bit about the evolution of the field. Because the field was so young at that point. So do you have any insights about how the field has been developing? Do you think it's still in its infancy, or where are we in our artificial intelligence research?

No, I don't think we're in the infancy at all. I think we're in the-- I think the field is really mature, in the sense that-- I know a lot of people might be disappointed by this, but I really swung to the opposite extreme. That is, I tend to think that the human mind is a collection of modules, each of which solves a fairly small problem.

People often are apologetic about the fact that our algorithms, each of them will solve some narrowly-defined problem, but they don't-- this is something that Ray Kurzweil makes a distinction about in his most recent book, The Singularity Is Near, between what he calls narrow AI and strong AI, perhaps an unfortunate phrase. Because [INAUDIBLE] used strong AI to mean something completely different.
But narrow AI, in Kurzweil's view, is these little algorithms for solving things like speech recognition, or stereo vision, or classical planning, or something-- these algorithms that make no pretense to solving all of intelligence. And so he imagines that at some point, this other thing will come into being, although he doesn't discuss how that will happen. And I think everybody in the field sort of is disappointed that this other thing hasn't come into being.

So a lot of the discussions here at this session, this meeting of the Fellows-- every time I go to a Fellows meeting, people talk about, how are we going to get this general-purpose intelligence? What's going to cause us to make this transition to a broad intelligence that can solve a wide range of problems, like a person? And I think it's going to happen gradually, without anybody really noticing it. That is, one module after another is going to plug in. And as they do this--

They'll all just be integrated together.

Well, we'll gradually be thinking-- we'll be thinking of machines more and more as being intelligent-- not, initially, the way people are, but not the way animals are, either, but sort of like a weird artificial animal. It's like, you don't expect your dog to be aware of everything you do, and you won't expect your computer to be aware of everything that's going on. But you'll expect it to be aware of a lot of things. You walk into a room, and it will see you, and it will recognize you, which may be a little spooky.

Probably.

Well, you want to be able to turn this off, obviously. But the point is that speech recognition has been coming along gradually. And the other day, I had a conversation with an automated system that was gathering information on a Sears dishwasher repair call. I was very impressed with the fact that it would say, what appliance are you calling about? And I said, dishwasher.
And it said, your dishwasher is giving you a problem. I thought that was quite interesting. Obviously, this is something I'm going to want in my computer, that I can carry on a conversation with it-- and not about some arbitrary topic, but I can tell it something about what's going on, and it'll supplement the usual mouse-touchpad interface. And as these sensory channels come online, we'll just gradually take for granted that computers have more and more abilities, even though we're not talking about passing the Turing test, or something of that sort.

And so at some point, I think the computer will pass the Turing test, some version of it. But we'll realize that in some ways, the Turing test is not-- it's not really a test of general-purpose intelligence. That is, if you listen to human conversations, I think there's probably a lot of people who couldn't pass Turing's test, in the sense that they couldn't pretend to be an academic. You know what I mean? One problem with this field is that everybody in it is an academic, or has a PhD, or whatever, and they tend to have an odd idea of what it is that human intelligence is.

And so if you're really trying to duplicate human intelligence, you might-- I mean, for instance, an example I've used recently is, suppose you're trying to carry on a conversation about sports with a man on a bus. I've listened, occasionally, to two men talking about sports on buses, or places like that. And the conversations are really quite shallow. If you could do speech recognition and have access to a database about sports statistics--

A robot that could go to a baseball game and be indistinguishable from--

Well, going to the game and observing is a different thing, but carrying on conversation.

The people in the bleachers.

What?

We could carry on conversation with the--

No, it's important that they be on a bus.

OK.

This is the thing. It's important.
You've got to define the problem narrowly.

Because you're sitting down, and they don't know that [INAUDIBLE].

They're sitting down. The context is-- you have to pay attention to the facial expressions. That's important. You have to catch certain nuances, and things like this. You have to be aware of emotions. People get very emotional about certain topics. And it's not an easy problem, but on the other hand, you can keep the conversation focused. That is, if you try to change the topic-- well, you could talk about the weather or something, I suppose. But there's certain topics you just couldn't raise. People would get annoyed with you, or they'd look baffled, or something. You couldn't start talking about cold fusion. You couldn't say, what do you think about cold fusion? Do you think it's going to actually happen? They're going to-- they would get uncomfortable, or something.

[INAUDIBLE]

The point is that that's what they want to talk about, is sports, or something like that. And so it's socially important. Now, I think it's much harder to pass, to do the equivalent thing with women, I think, because women talk about much deeper topics having to do with human relationships. So that would-- I mean, I don't think it's impossible.

On a bus?

In any context, I think you'd have to have more than just access to a database about sports statistics, or something. It's just a different set of social conventions. And maybe I believe it would be easier to do it with men, because I'm a man, so I think I have more insight into it. I don't know.

But what I'm saying is that I don't think there's ever going to be some magic moment where people say, oh, we've passed some threshold, and all of a sudden computers are intelligent, or we've finally figured out this notion of general-purpose intelligence, strong AI, whatever you want to call it.
I think even the smartest people are surprisingly narrow, in a lot of ways. It's true they can make a lot of connections. We tend to think of somebody as being really creative if they can do things that we can't do. So you know, if you're wrestling with a problem, and you go to talk to somebody, and that person can make connections you couldn't make, you say, oh, that's a creative person. I really admire that person. He's very smart. Whereas if somebody comes to you and asks you to do something, and you come up with a connection, you just think, well, this is just a mental trick I always use. I don't know what it is, but I just tend to do these things.

It's like Feynman in one of his books says, well-- I have a little trick I use, which is when somebody comes to me and starts talking about a theory, I try to think of an example. And as they elaborate it, I elaborate my example to fit their case. So if they're talking about some general-purpose, general laws of physics, I'll be thinking about a particular physical situation where those laws would apply. So he's talking about the theory of billiard balls; I'm thinking about two particular balls and how they would bounce off of each other, or something like that. And then when they ask a question, I'll think about my example.

Now, for Feynman, it's just a simple trick. For somebody else, thinking of the right physical situation and being able to do this in their heads while the person is talking is something they think of as magic, perhaps. But if you actually could open up Feynman's brain and see how he did it, you'd see, ah, yes, he happens-- his brain happened to develop in a certain way so that it tended to make these inferences in a certain situation that other people wouldn't make. So if we want to duplicate Feynman, we can do that. We just add this module.
But still, that was focused-- it's a particular kind of knowledge that he had. And he was very, very good at it, and everybody admired him, justifiably. But it doesn't mean that he had some set of general-purpose techniques that transcended the sorts of things we're doing now.

I think the ideas that people are exploring now-- the statistical methods people are using, the discovery that if you are willing to optimize your search algorithms, you can often attack exponential problems and have a surprising degree of success-- these ideas, which aren't really that earthshaking, are being applied to a wide variety of problems, and they're solving them one after another. And I think that's where the field is going to continue to go. So that's why I think it's mature. I don't think that we're still waiting for the big, the really good idea to come along.

Cool. That sounds good to me. So switching gears back to MIT, because we want to just hear about what the atmosphere was like and capture some of what was going on in the past, 'cause we are doing a history project.

Yes, right. Well, the atmosphere was terrific. This was from '71 to '76. It turned completely sour when I actually had to start working on my dissertation. That spoiled it.

I know what you mean, yeah.

But up to that point, though, it was great. We would sit around in the playroom. I mean, every day we would--

I just learned about the playroom. I had never heard of it before.

It was this open area. The offices were all around it. And this is the fifth floor of Tech Square, I guess. The offices were around it. There were beanbag chairs. So we'd come out of our offices and sit around talking about technical topics.
And the people that I worked most closely with were Bob Moore and Scott Fahlman, and we were all thinking about inference and planning. And so we just bounced ideas off each other.

Did people have similar sources of inspiration about why they were interested in artificial intelligence, or what drove people?

What drove us? Well, as I said, we were looking for-- we thought we were going to come up with some algorithms that would explain all of intelligence, really. I believe we thought that sensory perception wasn't so important, that you could get the data in and convert it to a formal notation. Then after that, you just did some reasoning. You would come to some conclusions, and those conclusions, then, would be shipped out to the motors, to the end effectors, and that would be that. A similar story underlay the Shakey project, the robot at Stanford. So even on the West Coast, they had some similar ideas.

We were actually-- Bob Moore and I especially were opposed to the ideology at MIT, which was sort of anti-logic. There was a big controversy about procedural versus declarative knowledge representation. Everybody thought this was of earthshaking importance. And so I was a convert to the idea that logical representations were important, declarative representations. Bob Moore, I guess, converted me to that idea. Because they just had a lot of cool properties. You could study them, whereas with procedural representations, it's just programs. What can you say about programs? You can organize them in various ways, but you actually have to write them. Whereas with logic, you could just think about inference algorithms; you didn't actually have to sit down and write the axioms.

So one of us, Bob Moore or I, invented the bomb in the toilet problem. I guess you could say we did it together. But I thought he invented it, and he thought I invented it.
And so when we compared notes years later, we realized we had no idea who invented it. But the bomb in the toilet problem is this problem where you're given two bomb-shaped objects, and one of them is a bomb. And you have, as your only method for neutralizing bombs, to put it in the toilet. So what do you do? Well, the solution is to put both in the toilet.

The reason this was important is because it's a counterexample to the idea that solving the problem requires merely proving that there exists a course of action. So if you think of this as a Prolog problem, then you might think of it as, you have a free variable, which is a course of action. You want to solve for it: find a course of action such that if you carry it out, you'll be safe. And yes, you can prove there is a course of action. It's either put bomb A in the toilet, or it's put bomb B in the toilet. But all you've done is prove that there exists a course of action. You haven't found it-- found the one that will actually protect you. So it's a counterexample to that simple idea. And so that's the sort of thing we would think about.

Throughout my career, I've been very good at coming up with examples.

Important skill.

Huh?

That's an important skill.

I guess. But so the bomb in the toilet problem, the Little Nell problem-- people come up with proposals, and I come up with examples that show they won't work, which is, I guess, a small contribution to the progress of the field. But so in the late '80s, I came up with what got named the Yale Shooting Problem. I'm not sure why. For some reason, I tend to name things after Yale. I wish I'd called it the McDermott, or the McDermott-Hanks, shooting problem. Steve Hanks was my graduate student, and he and I wrote a paper about this in the late '80s.
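To make the distinction above concrete, here is a small, hypothetical Python sketch (not anything McDermott or Moore wrote): with two packages and no knowledge of which one is the bomb, the existence proof "in every possible world, some single dunk works" does not hand you an executable plan, while the conformant plan of dunking both packages is safe no matter which world is the real one.

```python
# Illustrative sketch of the bomb-in-the-toilet point; all names are invented.

packages = ["A", "B"]
# We know one package is the bomb, but not which: two possible worlds.
possible_worlds = [{"bomb": p} for p in packages]

def safe(plan, world):
    """A plan (a sequence of packages to dunk) is safe iff it dunks the bomb."""
    return world["bomb"] in plan

# Existence proof: in every world, *some* one-dunk plan would work...
print(all(any(safe([p], w) for p in packages) for w in possible_worlds))   # True

# ...yet no single one-dunk plan is safe across all worlds, so the proof
# doesn't tell you what to do:
print([p for p in packages if all(safe([p], w) for w in possible_worlds)])  # []

# The plan that dunks both packages is safe no matter which world is real:
print(all(safe(["A", "B"], w) for w in possible_worlds))                    # True
```

The gap between the second and third checks is exactly the gap described in the interview: a disjunctive claim that a safe course of action exists, versus one concrete course of action that works under every possibility.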
And this problem derailed nonmonotonic logic for a few years. So anyway, that's the sort of thing we thought we should toss around back in the '70s, in the AI Lab. And it was just a tremendous amount of fun. I mean, I always tell incoming grad students that this is going to be the happiest time of your life, until you have to write your thesis. And after that, you have to get a job, and you'll be doing a lot of distracting chores for the rest of your life. So enjoy graduate school, because it's really the peak. Not everybody feels that way.

Do you have any other advice? Do you have any other advice for new researchers in the field?

Any other advice?

Or aspiring?

In AI? Well, one piece of advice I always give students is, pick a very simple problem, and then make it even simpler, because it will still be too hard, usually.

So AI, really, to my mind, is in many cases a methodology for attacking things, or a set of techniques. One big difference between then and now is that we thought we were discovering new techniques for solving problems. And really, a lot of those things are still around, but they're very weak. For example, if you take something like Kalman filtering, we would say, oh, my God, how uninteresting is that? It's a statistical technique. It only works if things are linear. It's for signal processing.

But nowadays, AI is often-- we often partner with other people. We're studying signals; we borrow signal processing techniques. We're doing things in the life sciences. A lot of people are doing game theory with e-commerce. People are working in e-commerce. So now we constantly borrow techniques, problems, algorithms from other areas.
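For readers who haven't seen one, here is a minimal, illustrative 1-D Kalman filter in Python; it is not something discussed in this detail in the interview, and all the numbers are arbitrary. The "only works if things are linear" remark refers to the assumption that the state evolves and is observed through linear equations with Gaussian noise.

```python
# Illustrative 1-D Kalman filter (linear dynamics x' = a*x + noise,
# observation z = x + noise); parameters below are made up for the example.

def kalman_1d(measurements, a=1.0, q=0.01, r=0.5, x=0.0, p=1.0):
    """Track a scalar state from noisy measurements.
    a: linear dynamics coefficient, q: process noise variance,
    r: measurement noise variance, x/p: initial estimate and its variance."""
    estimates = []
    for z in measurements:
        # Predict step: push the estimate through the linear dynamics.
        x, p = a * x, a * p * a + q
        # Update step: blend prediction and measurement by their uncertainties.
        k = p / (p + r)            # Kalman gain
        x = x + k * (z - x)
        p = (1 - k) * p
        estimates.append(x)
    return estimates

print(kalman_1d([1.2, 0.9, 1.1, 1.0]))
```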
What distinguishes AI is that we study problems that are harder than other people are willing to study. We're not afraid of the fact that these problems are intractable, and our method is to try something and see if it works. And if it does, then we explore that space for a while.

So if you look at things like SAT solvers-- who would have tried to solve SAT except an AI person? It's the intractable problem. Theorists would say, that's it, I proved it's intractable, I'm not interested anymore. But an AI person says, I'm going to try it. So they optimized the hell out of these things. And then Moore's law comes along and makes it possible to really look at the big problems. And all of a sudden, you discover there's this incredibly intricate space of hard problems and easy problems. You discover phase transitions in the space. You discover that there are lots and lots of special cases. So AI people are discovering these things and spitting out problems for theorists to explore for years.

And it's often hard for us to justify what we've done. We have to prove, empirically, that it's useful. It's often not easy. So it's this willingness to tackle very hard problems with empirical methods, often with [INAUDIBLE] in other areas. It's not techniques. It's not subject matter.

Great. Well, thank you so much.

You're welcome.