BUNDY: I'm Alan Bundy. I'm Professor of Automated Reasoning at the University of Edinburgh in Scotland.

INTERVIEWER: Great. Thank you for coming. So how did you get into the field of artificial intelligence?

BUNDY: I went to Edinburgh in '71, having done a PhD in mathematical logic at Leicester in England, and I had become very unhappy with my research in logic. I felt it had rather lost its motivation, and I was looking for something more application-oriented. Then I discovered the work at Edinburgh on automated reasoning, which was using the kind of logic I'd been learning about, so this seemed like an ideal opportunity. I applied for a research fellow position at Edinburgh and went there to work under Bernard Meltzer, who was running automated theorem [INAUDIBLE] in Edinburgh.

INTERVIEWER: What other kinds of research were going on in that lab at the time?
BUNDY: In the whole of Edinburgh, there were four main groups. There was one run by Donald Michie, and I guess the major research there was the Freddy the Robot project, which I think Harry Barrow spoke about earlier today, so I probably don't need to dwell on that too much. There was a bionics research lab, which was then run by Jim Howe. It had previously been run by Richard Gregory, who is famous for his work on optical illusions, but he had left by then. Its main interest was in education, computers and education. It had done some work on touchscreens, but the big project they started on soon after I arrived was Logo. They were following the MIT work on Logo, but they were principally interested in evaluating it in an educational context, so we had a lot of school kids running around in the lab.

INTERVIEWER: Were there turtles?

BUNDY: With turtles and so on, programming the computer, and so on. Then there was another group, the theoretical unit run by Christopher Longuet-Higgins. Its main interest was in language, so it did a lot of computational linguistic stuff.
BUNDY: But they also had an interest in neural nets, and in fact Geoffrey Hinton, for instance, was there as a student at that time. Lots of people who've gone on to be big names in the field were in Edinburgh around then.

INTERVIEWER: Was this the Edinburgh Artificial Intelligence Laboratory?

BUNDY: Well, it was actually then called the Department of Machine Intelligence and Perception, and it changed its name in '74 to the Department of Artificial Intelligence. But it had always been called machine intelligence.

INTERVIEWER: About how many people were there?

BUNDY: Oh, I guess it depends who you count. In terms of tenured members of staff, about half a dozen, but then there were quite a lot of research fellows and PhD students. So maybe 30 or 40 people altogether.

INTERVIEWER: I know in Minsky's room they had a playpen. Did you have any similar installation?

BUNDY: No, not really, no. But we did have the Freddy project, the Freddy robot.
BUNDY: My own group, as I said, was concerned with automated reasoning, and at that time there were some very notable people there. When I joined the group, Pat Hayes was there, who went on to become world famous for his work on knowledge representation. He'd already written a seminal paper with John McCarthy, McCarthy [INAUDIBLE], which is one of the seminal papers in the field. Bob Kowalski was there, who went on to become the founder of logic programming and famous for his work on Prolog and logic programming generally. And Bob Boyer and J Moore were there, doing pioneering work on inductive theorem proving and its application to formal verification. They just, in fact, won the ACM Software System Award for that work after all these years, so that was another. That started in about '73 at Edinburgh. My own work then had two themes. One was on meta-level inference and the way in which one could reason about reasoning in order to guide reasoning.
BUNDY: The other was about how one forms representations of knowledge automatically from, let's say, a description in English, and then goes on to use that for reasoning. That's been the dominant theme of my work ever since.

INTERVIEWER: When you were looking for inspiration about how to develop the field, did you look to philosophy? What other things were there? Actually, I was just thinking about this because I know you were particularly interested in the MIT AI Lab at this time.

BUNDY: So actually, one of the major influences was the MIT AI Lab, but it was kind of controversial. Our group had been founded on resolution theorem proving, and Robinson was a frequent visitor to our group, and that had become very fashionable for a while in AI. But Minsky and others at MIT had been extremely critical of that whole approach, and I think a lot of what we did was actually in reaction to that. If you look at Pat's work on common sense reasoning, you can see him being heavily influenced by that critique of logic, which he set out to defend against, but it also influenced his approach in a very constructive way. It certainly influenced my approach.
BUNDY: The whole meta-level reasoning thing was a way of trying to get a handle on how you would guide reasoning processes, because one of the major criticisms was that these resolution systems were just completely unguided. I think the Boyer-Moore approach was similar in that respect, looking for heuristics as a way of guiding reasoning. And certainly Bob Kowalski's promotion of logic as a programming language, rather than as a common sense reasoning engine, was also, in a way, influenced by that. So I think that was a major driver. That whole criticism from the MIT school was a major point of discussion at the time.

INTERVIEWER: Cool. And how did you know about what was going on in the other labs? Was it more through publications?

BUNDY: Oh, no, we met these guys.

INTERVIEWER: Conferences?

BUNDY: The field was very small. Yes, the IJCAI conferences, for instance, which happened every two years. They started in '69. Actually, the first conference I ever went to was the IJCAI that was held in London in '71.
BUNDY: People came from all over the world, so I would meet most of the US researchers and so on there.

INTERVIEWER: At that first meeting, where in the world were people coming from?

BUNDY: Oh, all over the world. Apart from Britain, because the conference was being held in Britain and of course there were a lot of British people, the other major place was the US. Michie came to Edinburgh in '63, and the Department of Machine Intelligence and Perception was founded soon after that, I think in '67. Around that time, there were really only four labs in the world that were doing AI. I mean, there were small groups elsewhere, but the big labs were MIT, Stanford, Carnegie Mellon, and Edinburgh. And so we had very close contacts with all those people. There was a lot of travel. In '73, IJCAI was at Stanford, and I went across to that. I think there were about 300 people there, so you more or less knew everybody, even people in other fields like language or vision, where I wasn't directly involved in the research.
BUNDY: I would know a lot of people there. And we had particularly strong relations with some of the other labs. I would say the strongest were probably with Stanford and SRI, because they shared a very similar logic-oriented approach to the field. So with McCarthy, and with Nils Nilsson, particularly on the Shakey project: there was a lot of interaction between the Shakey project and the Freddy project.

INTERVIEWER: How has the atmosphere of Edinburgh changed since then?

BUNDY: Oh, you mean the social atmosphere, or the academic atmosphere?

INTERVIEWER: Both.

BUNDY: Well, socially, Edinburgh at that time was famous for its arguments amongst the professors. There was a kind of war amongst the senior professors. It was very well known and highly controversial, and it led to exciting times in the Chinese curse sense of the word.

INTERVIEWER: Why did it happen?
BUNDY: I think it takes a certain kind of person to set up a totally new field: people with very strong wills, very strong ideas about where they want the field to go, and a lot of chutzpah, as it were, to persuade people that something new needs to be set up. The AI department, or rather the Machine Intelligence and Perception department, was set up de novo at Edinburgh. Most new departments spin out from other departments, but here was something completely new. So it took very strong personalities, and Donald Michie had gathered together some very strong people, like Richard Gregory, Christopher Longuet-Higgins, Bernard Meltzer, and so on, to set up these different groups. They were all strong personalities, and they didn't always get on. Things settled down in '74, when we created the Department of Artificial Intelligence, but before that there was a lot of argument. So now it seems to me much smoother, because I lived through those times.
BUNDY: But technically, it was a very exciting time, because everything was new, people were trying out all sorts of different approaches, and there was lots of discussion about that. People were very exploratory. Now there are lots of well-recognized mechanisms that have been developed within the field and theoretically explored, so their properties are known. There are very good implementations which have been well tested, and if you build a system nowadays, you build it out of components, a lot of which you get from elsewhere and put together; you won't do much de novo programming yourself. There'll be some, but you build on other people's work. Then, everything was new, so there was very little to build on. More or less everybody started from scratch, some of the programming was very ad hoc and rough, and the mechanisms were not well understood. So the programs were much more brittle things, because things were poorly understood. They would just break down a whole lot more.
BUNDY: I remember reading a paper about the [INAUDIBLE] system, which had been built, I guess, by the [INAUDIBLE] people, mostly in the early '70s. There was this wonderful quote that said: most AI systems simulate intelligence only on the example in the thesis; on another example, they simulate total failure, or death. That was very true then, and I think it's not true now. I think systems are much more robust.

INTERVIEWER: Do you think we're going to see artificial intelligence majors come up in undergraduate programs, as a common feature?

BUNDY: We've been teaching undergraduates since 1974 at Edinburgh.

INTERVIEWER: They can major in AI?

BUNDY: Oh, yeah. Absolutely. I was actually responsible for organizing the very first undergraduate course. It was a second-year, sophomore course in AI, just a specialized course, but within a few years we had joint degrees. Our first joint degree was actually with linguistics.
BUNDY: At Edinburgh, we've always had very strong connections between the AI work and the linguistics department, and lots of work on computational linguistics. So that was our first joint degree. Then we had joint degrees with the computer science department, which was a separate department then, and with mathematics, and then with psychology, and so on. And now we have a single-subject AI degree.

INTERVIEWER: Wow, what does it consist of?

BUNDY: Oh, well, there's so much material now. That's one of the things that's happened in the last 30 years: since I've been in the field, so much has happened that there's no shortage of material. In our second year, we use the Russell and Norvig book, and maybe use [INAUDIBLE] of it.

[INTERPOSING VOICES]

You could probably get most of the contents of a degree [INAUDIBLE] and that was the whole of that book, a tremendous amount [INAUDIBLE].
INTERVIEWER: So what are your hopes for the future, just to close out?

BUNDY: I've got very interested again in this area of representation formation and evolution and repair, so I'm getting very excited about that at the moment. I think it's a very neglected area, and there are lots of interesting things to do. It's had a very liberating effect on my own thinking, because all kinds of issues which I thought were hard or intractable become much more tractable once you get a different perspective on them. So that's an exciting area that I want to push on myself in the next few years.
INTERVIEWER: Great. Thank you so much.