So why don't we start out just by introducing yourself?

OK. So my name's Bill Swartout, and I'm now at USC, the University of Southern California. I got my undergraduate degree at Stanford, which is where I actually first got involved with artificial intelligence. I graduated in 1974 and worked with Cordell Green out there, who was then on the faculty. He was interested in automatic programming, basically: could you get it so that a computer, from some nonprogrammatic description like input-output pairs, could actually infer the program that would generate that output from that input, and generalize it?

So I was an undergraduate, and there was a program that had been written by a graduate student that actually did infer LISP programs from example input-output pairs. And it worked on exactly one case. I took this course, and you had to have a project, and of course my job was basically to get it to work on the other 15 examples that it was supposedly originally designed to work on. This turned out to be quite an intense debugging effort. I remember I finally got it to work on about 12 or 13 of the cases it--

Impressive.

--it was supposed to work on. And about 4 o'clock in the morning I had an experience where I realized how your memory can be impaired as you get sleepier and sleepier and sleepier. What was happening was I was working on the final example that I was going to do, and there was some bug that was causing it not to work. And I figured it out. But by the time I had figured it out, I realized that I had lost all of the memory traces back to where I was starting from. So I said, OK, time to go to bed. It'll be fine if it works on 13 cases.
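To give a flavor of the kind of inference being described, here is a rough, purely illustrative sketch, not the 1970s system from the course project: it enumerates short pipelines over a small, invented set of list primitives and keeps the first pipeline that is consistent with every example input-output pair.

```python
# Illustrative sketch only -- not the actual system described above.
# Idea: search small compositions of primitive list operations and keep
# the first one that reproduces all of the example input-output pairs.

from itertools import product

# A tiny, hypothetical set of primitives to search over.
PRIMITIVES = {
    "reverse": lambda xs: list(reversed(xs)),
    "sort": sorted,
    "tail": lambda xs: xs[1:],
    "double_each": lambda xs: [x * 2 for x in xs],
}

def synthesize(examples, max_depth=3):
    """Return the shortest pipeline of primitives matching all examples."""
    for depth in range(1, max_depth + 1):
        for pipeline in product(PRIMITIVES, repeat=depth):
            def run(xs, pipeline=pipeline):
                for name in pipeline:
                    xs = PRIMITIVES[name](xs)
                return xs
            if all(run(inp) == out for inp, out in examples):
                return pipeline
    return None

if __name__ == "__main__":
    # Example pairs: drop the first element, then sort what is left.
    examples = [([3, 1, 2], [1, 2]), ([5, 4, 9, 0], [0, 4, 9])]
    print(synthesize(examples))  # ('tail', 'sort')
```

Real program-synthesis systems are far more sophisticated, but the core loop is the same: a space of candidate programs, constrained and checked against the given examples.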
So that was my introduction, in a small way at least, to working on AI. And later Cordell and I and the graduate student who was involved published a paper on that work, my first paper in an AI conference. And that was kind of fun.

As an undergrad. Not bad.

Yeah, it was kind of fun. And then I went to MIT and worked with Peter Szolovits in the-- well, initially I worked with Bill Martin and then later with Peter Szolovits in the clinical decision-making group there, which was basically concerned with trying to help physicians by writing programs that could give advice about diagnoses and therapy recommendations and all of that.

He was [INAUDIBLE].

Oh really? OK. So the program that I worked on was the Digitalis Therapy Advisor. Digitalis is a drug that's actually still given today. It was first prescribed, or described in the medical literature, in like the 1700s, so it goes way back. It's derived from the foxglove plant. But it's a drug that's very difficult to give properly, because the difference between a toxic dose-- one that's going to make you sick-- and one that's going to be therapeutic is only about a factor of two, compared with, say, something like aspirin, where the error range is a factor of about 100. And in fact, the variability in sensitivity between patients is bigger than the therapeutic window. So you have to essentially customize the dose for each particular patient that you're going to give this drug to.

So Howie Silverman, who was a master's student at MIT a year ahead of me, had written an initial version of this program. And the job Pete gave me was basically to allow this program to explain its reasoning to a physician, the idea being that if you give a physician some advice but you don't also explain why this is the right thing to do, they won't accept the advice.
And so, basically, I had to go back in and develop, essentially, a new approach to programming that allowed the computer to take the routines and procedures that it had and paraphrase them into English in a reasonably understandable sort of way, even for somebody who isn't a computer genius. And that became my master's thesis, essentially.

I then went on, in my PhD, to look at the issue of not only how you can get a program to explain what it's doing, which is what looking at the procedures does, but also how you can get the program to explain why it's doing what it's doing. In other words, give you the rationale behind the recommendations it's making-- that was sort of the causal reasoning. The reason that was particularly difficult is that that information, the why behind what the program is doing, doesn't have to be in the program for it to work correctly. You can have a rule that says, if you see this and that phenomenon, just reduce the dose by 20%. It does the right thing, but it doesn't tell you why it's doing what it's doing.

And so I basically looked into developing some techniques for building an automatic programmer that would start from much more abstract knowledge of the domain and problem-solving principles, and then from that infer the working system, keeping behind, essentially, the mental breadcrumbs of where each procedure came from in the causal reasoning chain. So that when you were running the program and said, OK, why is it doing this, it could go back and say, well, there's this causal interaction that happens between the drug and this phenomenon, and if you don't do this, bad things will happen, and here's what they are. And so that was pretty cool. That was pretty fun.
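A minimal sketch of the underlying idea, keeping the rationale that justified each step alongside the step itself so the system can answer "why" and not just "what". The rule contents and names here are invented for illustration; this is not the Digitalis Therapy Advisor or the derivation-based system described above.

```python
# Hypothetical sketch: each rule carries both an action and the causal
# rationale it was derived from, so recommendations come with a "why".

from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    condition: Callable[[dict], bool]  # predicate over the patient findings
    action: str                        # what the system recommends
    rationale: str                     # causal knowledge behind the rule

# Illustrative rules only -- not taken from the actual system.
RULES = [
    Rule(lambda p: p.get("serum_potassium", 4.0) < 3.5,
         "reduce the digitalis dose",
         "low potassium increases sensitivity to digitalis, raising toxicity risk"),
    Rule(lambda p: p.get("nausea", False),
         "check for digitalis toxicity",
         "nausea can be an early sign of digitalis toxicity"),
]

def advise(patient: dict):
    """Return (recommendation, rationale) pairs instead of bare advice."""
    return [(r.action, r.rationale) for r in RULES if r.condition(patient)]

if __name__ == "__main__":
    for what, why in advise({"serum_potassium": 3.1}):
        print(f"Recommendation: {what}\n  Because: {why}")
```

The contrast with a plain "reduce the dose by 20%" rule is that the rationale field preserves the breadcrumb a physician would need in order to trust the recommendation.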
Could those kinds of systems work to infer, I don't know, why a certain disease is caused?

They would have information like that in the program. And the system that's building the diagnostic system or the therapeutic system would use that causal information, together with some strategies for how to reason in the domain, to produce a system that would do the diagnosis. So yes, basically.

Cool. And what do you think the most exciting potentials in AI are right now?

Right now? Well, the area I'm working on actually goes back to the Dartmouth conference and its goals, which were really to produce an integrated artificial intelligence that exhibited the whole gamut of human behavior. One of the things I'm working on now is virtual humans, which are computer-generated characters that appear in a simulation or a game. The goal is basically to have them behave much like people. So you talk to them in natural language, and they talk back to you. But it's not just verbal communication. They're also using gestures where appropriate. They're using eye gaze as appropriate, engaging the listener. And they're exhibiting emotions.

That's actually, interestingly, an area of research that AI really hasn't paid much attention to, I would argue, over the last 50 years. It's just starting to become something that we're looking at. And one of the things we found in our work is that when we initially went into this, we thought emotion would just be an overlay, that it would kind of be a modifier of behavior. So if you're speaking and you're angry, you kind of frown. But actually, what we're finding is that emotion cuts much deeper than that. There are aspects of emotions that are pervasive. So, for example, it can affect the way that the system reasons.
So if we as humans are angry, we will tend to reach conclusions much more rapidly. If we're depressed, we'll be more deliberative; we'll think things through more. And it's clear that emotions are functioning as a kind of metacontrol on some of the reasoning processes that we use.

Seems like, maybe, AI systems could benefit from some of that.

It also seems like emotions provide another window, essentially, into the status of what's going on with these characters. Very often now, if you talk to generals in the military, they say, we're getting reports from the field, and we would much rather get a report from the field by radio than by some sort of email message. The reason is that they can hear in the tone of the soldier's voice how well or how badly things are going. So it's clear that we, as people, use emotions as a kind of shorthand for a lot of things, and it's part of the communication. And it seems that if we're really going to build AI systems, they have to have that capability, for a lot of different reasons.

What are some of those reasons?

Well, one is this communication aspect. The other is that if people are looking at something that looks like a person up on the screen, they will, in fact, ascribe emotions to that character whether they've been modeled or not. It's just because we're terribly anthropomorphic. We think our pets have emotions; all sorts of things have emotions. And so if you're going to avoid miscuing people, you've really got to get into the game of modeling these emotions. So that's kind of a very quick summary of why some of these things are important.
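One way to picture the "emotion as metacontrol" idea mentioned above is an agent whose emotional state modulates how much deliberation it does before committing to a conclusion. The sketch below is hypothetical; the moods, thresholds, and evidence labels are invented for illustration and are not drawn from the virtual-human systems described here.

```python
# Hypothetical sketch: mood acts as a metacontrol parameter that sets how
# much evidence the agent considers before it commits to a conclusion.

EVIDENCE_THRESHOLD = {
    "angry": 1,    # jump to a conclusion after very little evidence
    "neutral": 3,
    "sad": 5,      # deliberate longer, weigh more observations
}

def decide(evidence_stream, mood="neutral"):
    """Accumulate evidence until the mood-dependent threshold is reached."""
    considered = []
    for item in evidence_stream:
        considered.append(item)
        if len(considered) >= EVIDENCE_THRESHOLD[mood]:
            break
    # Commit to whichever observation label dominates what was considered.
    conclusion = max(set(considered), key=considered.count)
    return conclusion, considered

if __name__ == "__main__":
    evidence = ["hostile", "neutral", "friendly", "friendly", "friendly"]
    print(decide(evidence, mood="angry"))  # concludes "hostile" from one item
    print(decide(evidence, mood="sad"))    # weighs all five, concludes "friendly"
```

The point is not the specific numbers but the architecture: emotional state sits above the reasoning loop and tunes it, rather than merely coloring the output behavior.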
Has there been a lot of cross-pollination between simulations and artificial intelligence and the video game world?

That is an area that is already quite warm and is really heating up. What's been happening with video games, as you know, is that over the last 10 years or so, the big differentiator for a game company was the graphics. So if next Christmas your game had the greatest graphics, you had a hit. What's happening now, with the advancement in graphics cards and also in the underlying middleware for producing the graphics these games have, is that the graphics are getting pretty close to photo-real in these environments, and pretty much all the games are at a similar level in terms of graphics. Once you get to photo-real, you can't go any further, really; you're dealing with the limitations of the human eye. So game companies are recognizing that graphics isn't going to be the big differentiator for them anymore, and what they're starting to turn to is behavior: how do these characters behave?

The other thing they're looking at is how to expand the market base beyond the typical first-person-shooter audience-- younger males-- to a much broader audience. And what they're finding is that games that emphasize relationships are more appealing to women. And in fact, right now, The Sims 2 is, I believe, the largest-selling computer game out there. And that's a basically nonviolent game that is really all about the relationships between these simulated characters. Now, if you look at The Sims, the characters by AI standards are still pretty primitive.
And so we're in this really interesting position, from the standpoint of AI research, that there's a lot of technology we have in the lab that, without too much trouble, I think we can start to move out into these areas and really create some interesting game experiences. I think there will be a real revolution in video games.

Are there any other places where you see that sort of symbiosis growing?

I would say education is one-- education and training. Basically, being able to construct simulations of various kinds. One of the things we've been working on is a simulator that uses virtual humans for teaching people how to negotiate. So, basically, you're talking to a virtual human, and you're trying to strike a deal with this virtual human, who has his own internal goals and beliefs and desires. And as you talk to him, he's appraising what you say in terms of his own desires and then reacting and responding. Basically, the characters respond using psychologically motivated strategies; they have a variety of different strategies that they use in responding to people.

So you can imagine using this kind of thing not only in educating people about how negotiation takes place but also as a way of doing role-playing for a variety of business situations-- for example, where you might want to give people practice in how to close a deal. You might also want to use it as a way for human resources to practice dealing with a problem employee. These are things that all managers hate to deal with, so giving them the ability to rehearse this with an artificial character might be really appealing.

So have you encountered any really amusing instances of versions of these programs, or even just in your research group, over the years?

Well, let's see. Give me a second to think about that.
Well, I'll tell you one thing that was actually quite moving to me. We've been working a lot with the US Army, and we constructed a simulation of a scene in Bosnia using virtual humans. It was just a prototype system, but the idea was that we wanted to be able to put somebody like a lieutenant into the kind of dilemma that might typically occur in an operation but that is not typically part of the standard training manual. So this was a situation where you're in Bosnia with your platoon. You're going to reinforce another unit that's dealing with some civil unrest. And on the way, your platoon gets involved in a traffic accident. There's a small child on the ground, seriously injured, and a crowd starts to form. A TV camera crew shows up and starts to film things. This is all done virtually. And now you, as the lieutenant, have to figure out what to do.

Well, we showed this to a bunch of Army people, and some of them were actually really moved by it, because, while the graphics weren't perfect or anything like that, it was evocative enough. They'd actually been on duty in Bosnia, and it reminded them of things. And it really made me feel like this is something-- I can see how this might actually work out. So it was a nice experience.

Well, that's cool. Well, thank you very much.

Thank you.

Great. I love being able to bring so many different perspectives to this field. It's absolutely amazing.