You, or should I look at the camera?

However you want.

Oh, it's a conversation, so you can consult the camera.

So why don't you start out by introducing yourself?

I'm Jim Hendler. I'm at the University of Maryland, and in January we'll be starting a new research group at Rensselaer Polytechnic Institute looking at what's called the semantic web. I've been doing AI since the mid '70s, and a lot of the work I've done comes more out of the scruffy part of the field -- making it work rather than the theoretical. I also spent a few years at DARPA funding AI, from 1998 to 2001.

Great, cool. So how did you get into the field?

So there's a generation of AI researchers -- I guess now we're all in our late 40s or about to turn 50 -- who were all young teenagers when the film 2001 came out. And even though HAL, the supercomputer, was not the hero of the movie -- it was, in fact, the villain -- none of us saw it that way. We were just blown away by the concept. Most of us had already either had our hands on computers or calculators, or were thinking about some of this stuff, but they were still big machines somewhere you had to go. And just the notion of this thing you talked to and interacted with, and it could fly a spaceship and could kill people and stuff -- it was just great.

Had its own personality of sorts?

That's right. The personality stuff was very good.

That's great. That's great. So taking it from movie to reality, what was your path?

So I ended up at Yale as an undergraduate and then as a research assistant in the Yale AI project. Around '74 or '75, I guess, Roger Schank, who had been a student at Stanford -- actually, he'd been a postdoc at Stanford -- moved to Yale to start his own AI group to really get looking at natural language.
And he had a very interesting perspective on language, which is that there would be a small number of primitive representational objects that could represent great amounts of language, and that you could use these to create what were called scripts in those days. So what Minsky was doing for the physical world with frames, Schank said we could do similar sorts of things for stories, things like that. So the famous story was: John went to a restaurant. He ordered lobster. He ate and left. And the question-answering system, asked "What did John eat?", should say lobster -- but the story doesn't say that. That's an inference. And so this was that sort of simple storytelling inference that was very popular in the mid to late '70s.

How far did he get?

It actually went very far. It started growing in many different ways. The project I was involved in with Jerry DeJong, who's now at the University of Illinois -- he couldn't be here -- was a project called FRUMP, the Fast Reading Understanding and Memory Program. Its idea was to summarize news stories as they came by, and we actually had it reading a news feed. You'd get an article, and it would say there was a hurricane or there was a flood, and it would try to report that there was a flood in such-and-such a place and it had so many casualties. And it made some very amusing mistakes, along with a lot of getting it right. We were being interviewed one time, and as Roger was being interviewed, his screen flashed and said there was a flood in Missouri, one person died. And he said, "See?" -- and later we realized the story was about Congressman Flood of Missouri, who had died. So you get those kinds of things going on, but mostly it was really an effort to pull these things together. Then successive students started extending it, so Wendy Lehnert did a lot on how you could use this approach to answer questions.
Mallory Selfridge looked at how we could use it to do some language learning, and how it might model how children learn. It eventually led into what became known as case-based reasoning, with people like [INAUDIBLE] and Chris Hammond, who were both fellows. So a large number of the fellows -- many of whom, unfortunately, aren't here today -- grew from this very different tradition than the formal MIT language tradition, from this much more AI-ish kind of approach to language, more influenced by the AI lab than by the linguistics department.

Interesting. So tell me more about the atmosphere of the Yale AI lab.

Oh, gosh.

What other things were going on? What was--

Yes, sure.

Were there bean bags on the floor?

No. Well, it was more formal than some of the others, and less formal than anything today. I mean, it really was mostly just office-type space, but there was a conference room, and there were things. What I think Roger Schank was famous for was starting what became known as the Friday Afternoon Fights, which was the research meeting -- it wasn't always traditional then that all the grad students working on something got together. And Roger felt it was very important to have a once-a-week meeting where everybody would get together, and it had a rough-and-tumble attitude, let's say. So if you got past about the second sentence of your first slide, you were considered to have succeeded greatly. And the idea was to provoke ideas, provoke discussion. It could be your own work; it could be reporting on other people's work. So it took the notion of the journal club, which is a traditional scientific notion of everybody reading a paper and coming and discussing it, but this was much more of a "let's take some research ideas and really explore and tumble them," and refutations would come back a few weeks later saying, no, look, you really can do this.

That's very embarrassing.
[CELL PHONE RINGING]

Let's just turn off this part of the--

I'm so sorry.

It's OK.

That's really embarrassing. Just have to rewind.

Film editing. Computers are wonderful tools.

All right. My mother called me during the panel yesterday, and I sat there going, not mine--

Yeah, people always joked that the funniest place to pick up your phone and answer it would be in the middle of a final exam. I have learned now to turn off my IM whenever I'm at a panel.

So where were we?

Talking about the Friday Afternoon Fights.

I was saying how these meetings had turned into much more of a rough and tumble, but what happened was you'd get an idea. It would be explored. People would have counterexamples. Those would be worked on, and so it actually made for a very active research area. And in fact, many of Schank's former students have gone off elsewhere and kept that notion of the weekly meeting, a lot of interaction, that sort of thing.

That's very cool. So you said you're a much more hands-on, building-things kind of person. Why do you say that? Can you give me some examples of research along the way, or just interests?

Well, so the AI community -- in fact, I guess it was probably at the first AAAI conference; if not, it was the first [INAUDIBLE] Science conference -- Bob Abelson, who was a colleague of Schank's, got up and gave a talk where he introduced the notion that different AI people had two different approaches: the neat approach, people who wanted to make everything formal and theoretical, and the scruffy approach, people who wanted to build and test. The person who probably had the most influence on my thinking about this was Herb Simon, who said the coin of the realm in AI is an implemented system.
So you give me a great theory of how something can understand language -- that's not going to be nearly as compelling as when I'm sitting in front of a system that's talking to me.

Absolutely.

And furthermore, he felt we could make the best progress by defining things we wanted a system to do and then getting it to do that. And the more different things it could do, or the better it could do them, the more you would say it was approaching some kind of intelligence. I wish the field had kept more to that in some ways. So I've always come more from that kind of perspective: we've always built systems, we've always had demos. We always tried to show our concept in something that works, because making it work exposes the flaws in your argument. If you're publishing it as a theoretical piece of work, it's really very hard to probe what's going on around the edges.

Absolutely. So you mentioned you were at ARPA for a period of time?

Yes, so I was there -- well, ARPA and DARPA change names periodically.

DARPA, yeah.

They're the same place.

Later on. More recent.

The last time it was ARPA was, I think, the early '90s. So by the time I was there, it was DARPA and SSA. I was there from '99 through 2001, and what I was doing there was primarily creating web languages for AI. So what's now called the semantic web -- I did a lot of the funding of that, and the goal was to try to take some of these ideas the AI community had, that had never gotten out into the real world as it were, look at what had worked on the web, and try to see if we could do it there. And in fact, the MIT tie-in now comes back, because one of the people who had been very involved in this early on was Tim Berners-Lee, who had invented the web and was now at CSAIL.

So what are the main goals of the semantic web?

Get this AI knowledge stuff all over the place.
Get it linked together, so that -- on the web, if I have a document and you have a document, my document can point at yours. So I don't have to necessarily say all the things that are in your document; I just point. I can't do that with my knowledge base. I can't do that with my database. So we want to get these other forms of data and knowledge linked together, so that now my database can be described by some kind of AI knowledge representation, an ontology, which can link it, in turn, to other databases and their ontologies. Like that, so you can start to really view the world as linking the knowledge together rather than just the text. Instead of having to discover that knowledge by reverse engineering the text in some way, we can make it much more explicit.

Interesting.

[INAUDIBLE]

No, I was just thinking about the possibilities for that. It's great, too. So what do you think the biggest applications for AI can be in the near future?

Biggest applications for AI? I think we're going to start seeing a little bit of AI across very broad swaths of things. So my slogan for the semantic web work, before it was even named that, was "a little semantics goes a long way." Most of the work in the field has still been trying to build big, structured, complex ontologies, but we're starting to find things like a little frame system for people, something called Friend of a Friend, which lets you define a few concepts about a person -- their name, their email address, things like that -- and who else they know, so it creates a social network. There are now about six million people with FOAF files known to the world, so it's become the most used AI thing, in a simple way, just by being very simple and spreading virally rather than our traditional "build a big killer app." And so I think it's the combination of some of those things.
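To make that concrete, here is a minimal sketch of the kind of FOAF description being discussed, assuming the standard FOAF vocabulary and the Python rdflib library; the people, email addresses, and URIs in it are hypothetical placeholders, not data from the interview.

    # Minimal sketch of a FOAF-style description, assuming the standard FOAF
    # vocabulary and the Python rdflib library. All names, email addresses,
    # and URIs below are hypothetical placeholders.
    from rdflib import Graph, Literal, URIRef
    from rdflib.namespace import FOAF, RDF

    g = Graph()

    # A person is described with just a few properties: a name, a mailbox...
    me = URIRef("http://example.org/people/alice#me")
    g.add((me, RDF.type, FOAF.Person))
    g.add((me, FOAF.name, Literal("Alice Example")))
    g.add((me, FOAF.mbox, URIRef("mailto:alice@example.org")))

    # ...and links to other people. foaf:knows can point at a person described
    # in someone else's document on someone else's server, which is what turns
    # a pile of small files into a social network.
    bob = URIRef("http://example.net/people/bob#me")
    g.add((me, FOAF.knows, bob))

    # Serialize to Turtle, one common concrete syntax for RDF data like this.
    print(g.serialize(format="turtle"))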
I think it's much simpler AI across a much, much wider--

From batch processing to the micro-computer.

No, from centralized processing to the web.

That's what I meant.

To the web -- and just think of the web of knowledge. That's an idea AI has never played with before: linking things together. My knowledge base about "cat" can point at your concept of "cat" and say these are the same thing. All of a sudden, all the stuff you've built and all the stuff I've built can start getting that network effect that decentralizes, opens, et cetera. But you lose a lot of things. You can't keep this thing consistent. What happens if I'm pointing at your knowledge and the server's down? So a lot of things we haven't worked on in AI, because our knowledge bases have always been clean and central and carefully designed, become big problems. But they're big challenges and opportunities, much like on the web, where the 404 error makes it possible to scale: you can't avoid having links to things that go away and still be scalable.

And so what do you think the most important skills for someone, a new researcher in the field, are?

I think they're still the skills we've had for a long time -- a strong background across the field, an understanding in some depth of some or all of the areas they're interested in, and, personally I think, creativity. I think we keep trying to stamp that out of AI and failing, and I think that's great, because no matter how mathematical you get, there are these big, hard problems that take somebody really just saying, "Let's go after it." And that just makes for great, fun work, and I wish we had more of that going sometimes.

Well, thank you very much.

Thank you.

Cool, so do we have--