He wasn't taping yet.

OK, so thank you for coming.

My pleasure.

So why don't you tell me what you're doing right now?

What I'm doing right now in the field of AI is concerned with cases and examples and how one reasons with them. I'm particularly interested in the problem of how one reasons with concepts that are messy, in the sense that they're not well-defined, say, by a logical definition with necessary and sufficient conditions, concepts that have exceptions and holes in them, and concepts that change, and the role of examples and cases in representing those sorts of things.

OK.

That's what drives me. I love examples and cases and hypotheticals.

Fabulous, and how did you get into the field originally?

Well, I came up pretty classically through the mathematics route. I was an undergraduate mathematics major at Brown, and I loved mathematics. I also liked thinking about the problem of how you get good at mathematics, but nonetheless I eventually ended up in the MIT math department as a regular graduate student. And at some point I got sort of bitchy about continuing to be a graduate student, and took a year off to go work at Lincoln Labs, supposedly to get some experience on stability problems, etc. But when I got to the labs I ended up being put on a project that I found quite boring. And I resorted to the sorts of things I did in high school, which was to go to the library and think about things I wanted to think about.

How terrible.

How terrible. And I proposed a project that would essentially build what we would today call a Mathematician's Workbench, that would be a repository for information like theorems and definitions and counterexamples in mathematics, and all the interconnections amongst them, so that you could ask questions: well, if I'm looking at this theorem, can you tell me whether I can drop a precondition, and if so, point me to the theorem and the proof; if not, show me the counterexample.

Is this a program?

Oh, by the way-- I wanted it-- it's essentially a hypertext, web-like environment that would let you navigate very, very freely among this highly interconnected knowledge base of mathematics. And so I wanted to first decide what should be in the knowledge base, and then build a program that would let you, you know, swing around it very freely. And that's how it started off, and I proposed that to my supervisors at MIT-- at Lincoln Labs. And they said, well, no, we're not funded to let you do that. But back on campus some people in the math department, particularly the chair, heard about it and said, why don't you go talk to Marvin and Seymour about this idea. And Seymour in particular, Seymour Papert, who was a professor of mathematics, said, well, why don't you come back to do this as your thesis?

What year was this?

This was-- I was at Lincoln Labs in '74-'75, and so I was having these conversations with them, I guess, the summer of '74, the fall of '74. I'd have to check my dates exactly.

Were they both the-- heading the AI lab at the time?

Yes, but I didn't know them so much through the AI lab as through my home base in the math department.

OK.

There's an awful lot of AI that has happened in the math department at MIT. There's a really strong tradition. And the mathematicians there are not scared to think about things in new ways, and so in fact the person who probably pointed me off in this direction was a classical functional-analyst type. And so, anyway, Seymour said, why don't you come back to campus and do this as your thesis? Whereas I had thought I was going to do a thesis on something else. And I thought, gee, my hobby is my thesis, what could be better? And so it was essentially a swing over to the AI side, which is-- now you would call it sort of a [INAUDIBLE] representation or an environmental kind of thing.
And one of the important things I thought to capture in this environment was the examples: how one counterexample or one example is built from another. I thought that was as much an ingredient of mathematical knowledge as the theorems, how one theorem is proved from another theorem. And of course there are interconnections between the theorems and the examples, so my initial work was on building up this representation that made room for these sorts of non-logical, or non-Sunday-school-mathematics, kinds of objects. And so--

Had any other stuff been done like that before?

No. I mean, there certainly were compendia of counterexamples in mathematics. There certainly were books of theorems; obviously, classical mathematics is all about that. And there were a few instances of how to think about mathematics, such as Polya's. But a representational scheme in an environment like this, no. And so there was the representational substrate, an epistemology of mathematics if you want, and a way to use that representation for the mathematician who knows what she wants. And then I had this idea that with all that structure you should be able to describe what it means to understand a body of mathematics well, and then to help someone else learn how to understand a body of mathematics. And this is more Polya-esque, in the sense that if you learn a theorem, you should ask, well, is the converse true? And if it's not true, well, what's the counterexample that disproves it? You should learn to ask those kinds of questions, to weave together this landscape of mathematical knowledge, and to know your way around it, and know what's important, and know what's merely a stepping stone, and know how certain examples can tie together different concepts. How some ideas permeate things, the heuristic content of mathematics. So for instance, to try extreme points, you know, like in programming.
You know, and there's a lot of this knowledge that's very important. And so all of this should be part of the bundle, and it has a landscape and it has ways to navigate it. And so that became my thesis. My official thesis advisor was Seymour Papert, and on my committee my co-advisor was Marvin Minsky, and it was a lot of fun.

Cool. What was the environment like at MIT at the time? Who were you collaborating with?

Well, I wasn't collaborating with anybody on my thesis. I mean, that was pretty much a solo enterprise. The MIT style seems to be to set you loose and give you all the freedom you want. And for me that was great, because I knew what I wanted to do. In retrospect, it would have been better to have had more critical interactions and perhaps more feedback on the goodness of the ideas. But when I was in the throes of it, I quite liked the freedom I had. So I guess my contacts were some of the mathematicians, people like Ken Hoffman, who's an analyst, and people in the AI lab, particularly Marvin and Seymour.

What was the atmosphere like at the AI lab then?

I don't know, pretty freewheeling, pretty tolerant of all kinds of new ideas, very individualistic. I think there were a lot of strong, smart individuals, you know, who could sort of-- it wasn't a project-oriented place, say, as much as a place like Stanford, which at least to me appeared to be more oriented around projects that involved a lot of graduate students working together; there seemed to be more solo enterprises. And of course, I spent my time, in a way, in three different entities at MIT. There was the AI lab, there was the math department, and there was another entity, which no longer exists, called the Division for Study and Research in Education, which was trying to marry lots of disciplines, such as philosophy, linguistics, AI, and mathematics, all around the perspective of education-- but it was broader than just teaching other people.
It was more of an AI, cognitive-science kind of orientation, understanding how thought goes on in these other disciplines. So there were people from [INAUDIBLE] from the architecture school, and there was Jeanne Bamberger from music, and it was-- that was a very interesting place. It was a cool place. It was too bad that the enterprise didn't continue.

So speaking of those kinds of things, what kinds of collaborations do you think exist between different fields and artificial intelligence, or where is the potential for that to happen?

Well, I think some non-computer-science disciplines have more intimate collaborations with AI than others. So for instance, I think things like biology, neuroscience, chemistry, VLSI, CAD/CAM engineering, things like that have had quite an intimate relationship with AI and use AI techniques. Other disciplines are more at a distance, such as law and maybe ethics and some of the softer disciplines. But I think the interaction there is just as interesting. In fact, after I had done this work in mathematics, I found out that most of my colleagues were math-phobic, or really didn't want to get into it to the degree I did. And I felt that the phenomenon of examples, and how you reason with them and how you build them, should be present in other disciplines. So for instance, the way I wrote programs: very often, if I had to write a new search program, I would look up the last one I wrote and tweak it a bit, which is essentially a case-based or example-based method of retrieving a good enough example and then pushing it further. That's also the way you can go to a counterexample. So I tried to look at other disciplines besides mathematics where that sort of problem-solving behavior was useful. And the one I-- it's useful in linguistics, but the one I really love, and have since spent most of my career in, is the legal domain, because at least Anglo-American law, and other types of law as well but to a lesser extent, is based on cases.
In other words, examples. And what's interesting about cases and examples in legal domains is that there's a difference between real cases (there's no hard definition of that, but essentially something that has been brought to the legal system) and hypotheticals. Hypotheticals are made-up cases, very good [INAUDIBLE] experiments, streamlined kinds of examples. And they're more like counterexamples in the sense of mathematics. And they get used in Socratic dialogue, the back and forth when you're trying to test your ideas or the professor is trying to push you around to see whether you know what you're talking about, or in oral argument in appellate situations, where someone's trying to see whether your arguments stand up if the situation is slightly different. So I think there's a nice potential relationship with disciplines like the law, and a few of us have pursued it. But it's not as widespread as I'd like, because I think it's a rich domain.

So what can AI do for the field of law?

Well, I think AI can do for--

What kind of applications are there in the near future?

Well, that's a different question, or that's a sub-question. I think in the general sense, AI can do for law what it can do for many areas, in that it gives these other people in these other areas who want to think about their field a framework for thinking about their field. So if you're interested in describing problem solving, or what it means to think like a lawyer, to paraphrase Kingsfield from the pictures. The people in legal philosophy, or who look at legal reasoning on the legal side, don't, I think, have a very rich vocabulary for describing what it means to think like a lawyer. For instance, the doctrine of precedent says that similar cases should be decided similarly, but there's no further explanation of what it means for two cases to be similar. Well, in fact, there are lots of ways to describe similarity.
And AI gives you a handle on making quite precise what sort of similarity you're interested in, and how to manipulate it and how to change it. And so AI studies of legal reasoning can shed light on the jurisprudence of case-based reasoning, and what it means for cases to be similar, and how you create those nasty hypotheticals anyway. And that is not only kind of a philosophical, jurisprudential enterprise, but it also could be an educational enterprise, if you want to teach law students how to push back with good hypotheticals, or if you want people in practice to think about how you can imagine situations that might get you into trouble. You need to think about hypotheticals; well, how should you come up with those kinds of cases? So I think the kind of work that we do in AI demystifies some of those processes, which seem quite magical. I mean, if you think back to your undergraduate years taking real analysis, how did those people come up with those wild counterexamples? It seemed like magic. Well, it's not magic, or at least it's not all magic. And there are some models for how to do that that can shed light on it, and those are AI's contributions, not only to the field but to how to teach the field.

Well, so in the future do you see machines or computers as being able to give advice to lawyers or politicians, like in a courtroom or in a government setting?

I see them as able assistants. I don't ever see them in the role of judge. I think that's silly, because the law is much more complex and subtle than that.

Right.

It's a very subtle enterprise. But I think they certainly could help you to perfect your arguments, or at least uncover some of the obvious pitfalls in them. They might help a beginning lawyer, or someone who's not an expert in an area, operate with higher expertise. I don't think they're necessarily going to outperform the world's expert on the topic of their expertise, but I think they will enable. I think they can be helpful.

That's a very exciting potential.

But they're-- but they're not going-- I don't-- my goal has not been to replace the lawyer, and it's certainly not to replace the judge. I mean, I don't even consider that an interesting question. But--

Cool. So are there-- just to-- are there any closing stories or anecdotes that you'd like to tell about your experiences [INAUDIBLE]?

Well, maybe this comment that-- so going back to my interest in cases, and going back to the interconnections of knowledge, in fact that was influenced a lot by my undergraduate experience being in the computer science department where hypertext was being developed. And so, I mean, that's an influence on my way of thinking that I guess is pretty pervasive, but it comes from something outside of mathematics, which was supposedly my home discipline, and outside of AI, which is my practicing discipline. So I think maybe one of the lessons learned is, you know, it's a multifaceted enterprise, and you get a lot of information, a lot of ideas, from elsewhere. And it's a lot of fun. It's a lot of fun to learn new things, such as a new discipline like the law. There's lots of stories, but you know.

Yeah.

So.

Great. Well, thank you so much.

It was my pleasure.

OK.