Hello, friends and colleagues.

I chose this as the title for my talk thinking of the following metaphor. 500 million years ago, photosynthetic bacteria in the ocean gradually changed our atmosphere to one of oxygen. And I said, oh, what must have happened visually? It must have been the case that we went from a murky, obscure atmosphere to one where you could see clearly for a long distance, like we can today -- in most places besides Los Angeles. I don't think that was quite the case. But I think it's a good metaphor for what's happening with the information revolution we have now. We can access information easily from much further away than we could before. And this transition of the atmosphere, I think, is sort of a nice metaphor -- even though it's not accurate, maybe not on this planet. Keep it in mind.

A background reading assignment for all of you: if you haven't seen Enemy of the State, go see it. It's a wonderful movie. There's a lot of discussion of privacy and illustration of what current technology is capable of. One of the corrupt NSA employees there has this nice quote:
"Privacy is an illusion. We haven't had any for 20 years. All that's left is what's in your head. And maybe that's enough."

I don't think that's true. I don't agree with that. I don't think it's enough. But we need to work in order to preserve our privacy. Go see the movie.

What's happening? The digital revolution reverses many of the default assumptions we've had. What was once hard to copy is now trivial to duplicate, and this is having grave and tremendous consequences for the publishing industry. The once-familiar friend is now hard to identify online. You get some email allegedly from your friend. Is it really from your friend? You're no longer so sure. What was once forgotten is now stored forever. You can look up everything you've ever posted on a newsgroup in Deja News. And what was once private is now public, more and more so as time goes on. And that's the issue I'm addressing today.

How do we visualize this problem?
I think trying to imagine a world where we have 100 to 1,000 times as much information stored about everything we do is the place to start. So imagine your personal website now has 1 million automatically generated web pages in it, documenting every aspect of your life. [? Vannevar ?] Bush sort of had an image like that. And Dave Carter talked about this kind of thing earlier, too. Every artifact has a website. Everything that's ever been made, there's a little web page describing it. Every transaction has a web page. Video cameras have plummeted in price to $19.95, sold on late-night TV, and they're now pervasive. They're all over the place. Video feeds are coming from everywhere. You're always on camera. And the computers track where you are and what you're doing.

So here's a website for my toothbrush. It gives the owner, who it was made by, et cetera -- everything from transaction information to its current location.
Actually, this website is supported by the computer that's in the toothbrush itself, of course, because everything has got a computer in it.

That's one kind of information you may want to control: personal information, personal artifacts. Then there's transaction information -- here's a transaction web page recording the purchase of that toothbrush. This might be information that's maintained both by me and by the manufacturer or the store that I bought it from, along with other kinds of information about it. This is the kind of information, known by two parties, that the European Privacy Directive is so good at addressing: what are companies allowed to preserve about various kinds of transaction information? So you have, on the one hand, personal information; here's transaction information. And then we've got, maybe, this video kind of information. Here's the bathroom video cam showing where that toothbrush is, sitting on the sink.

But you can imagine that these video cameras are not just in private homes; they're going to be all around. We've seen this in the news already. You'll have cameras on the streets.
And people can be monitored very easily.

So privacy is the issue. There's a great study of privacy by Louis Brandeis (later a Supreme Court Justice) and Samuel Warren in 1890: "Privacy is the right to be let alone." And another wonderful quote: "Numerous mechanical devices threaten to make good the prediction that 'what is whispered in the closet shall be proclaimed from the house-tops.'" And if a webcam in your bathroom doesn't count as that kind of thing -- I mean, that's really exactly what they're talking about.

More recently, we've had justices holding that technology that breaches a reasonable expectation of privacy violates the Fourth Amendment and requires a court order. So privacy is something that comes into the law a lot. We need to define legally what we mean by privacy -- what a reasonable expectation of privacy is these days. That's getting harder and harder to define. But we need to work on the technology to enforce and enable a reasonable level of privacy, a reasonable expectation of privacy.

So privacy is no longer the default.
You should assume everything is public unless you've made some effort to make it otherwise. We need to work to define what security policies we want -- who's allowed to see what -- and then work to achieve those. We don't have the natural difficulty of accessing information anymore. What was once in a file cabinet in some secretary's office, hard to access, has now become a website on the computer in that same office. We need to establish artificial barriers where there were once natural barriers.

So you have to decide. Before, the default was private. Now you have to decide what should remain private. Who should know where you are now? The kind of scenario that Hari Balakrishnan gave earlier, where you can find our colleague in Greece, was great. But if you've got a stalker after you, you may not want your current location to be known. Who should know what you are doing now, or saying, or reading? We can implement lots of different mechanisms to enforce policies one way or the other, but you've got to say now: what should be public, and to what extent should it be private?
Who should know what you have purchased, when and where -- where you bought your last toothbrush? Who should know your medical history? Clearly, that needs to be shared; we need to think hard about the right kinds of access policies for this information. Who should know your voting record, something that needs to be a bit private, or whom you communicated with, or who you are? We've got a panel coming up on anonymity that talks about this issue.

The first director of LCS, Bob Fano, wrote an article in Scientific American back in 1967 which remains true today: "You should have the right to decide what is to remain private." I think that's an important, fundamental principle of privacy. But that means you've got to put the work in and decide and think about these questions.

In Oxygen, we imagine that the technology for achieving these goals, once you've decided what you want to do, will be based on public-key cryptography. That's really the only workable technology -- the only flexible authentication and encryption technology -- in a large-scale system.
Each individual or device will have public keys. You might have more than one if you're acting in several roles: a public key for yourself as a private individual and another key for your business role. We may get to the point where messages that are not digitally signed or authenticated cryptographically are merely considered trash and thrown away -- this junk that comes in over email from who knows whom, we just toss out.

Your internet identity really then becomes tied to your public key, which verifies the signatures you create. The public key is, in some sense, something you own and control -- though it's almost the other way around: the public key defines the identity, and you're sort of the physical appendage to it. So identities become the public keys in cyberspace.

Freshly made public keys can be used as pseudonyms. You can make up a new identity by creating a new public key and starting to sign things with it. That starts creating a record of statements made by that new entity. No one knows that you, as a physical person, control that pseudonym.
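The idea just described -- a fresh keypair acting as a pseudonym that signs statements -- can be sketched in a few lines. This is a toy illustration using textbook RSA with fixed small primes; it is completely insecure and is my own construction, not anything prescribed in the talk. A real system would draw fresh random primes (or use a modern signature scheme) so that every new pseudonym is a genuinely new, unlinkable key.

```python
# Toy sketch: a fresh public key as a pseudonym (textbook RSA, tiny fixed
# primes -- insecure, illustration only).
import hashlib

def new_pseudonym():
    """Return (public_key, private_key); the public key IS the identity."""
    p, q = 1000003, 1000033            # toy primes (never fixed in practice)
    n, e = p * q, 65537
    d = pow(e, -1, (p - 1) * (q - 1))  # private exponent (Python 3.8+)
    return (n, e), d

def sign(message, n, d):
    # Only the holder of the private exponent d can produce this value.
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(h, d, n)

def verify(message, sig, n, e):
    # Anyone holding the public key (n, e) can check, but cannot sign.
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(sig, e, n) == h

public, private = new_pseudonym()
s = sign(b"statement by pseudonym #1", public[0], private)
assert verify(b"statement by pseudonym #1", s, *public)
```

The point of the sketch is the asymmetry: verification requires only the public key, so publishing it creates a checkable identity without giving anyone else the power to speak for it.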
And using the pseudonym, of course, doesn't give other people the ability to sign for that pseudonym. That's the nice thing about public-key cryptography: you can sign things without giving other people the ability to sign for you. And no one can link your pseudonyms; only you can transfer information between them. There are actually several students at MIT currently working on pseudonym systems, the wonderful things you can do with them, and the fantastic mathematics for these kinds of things, which I don't have time to get into. But it's a good research area.

So you've got the keys there. Then access control needs to be specified. Somehow, you need to tag a web page or a piece of information with the public keys that are allowed to sign requests for access to that information. So the keys are, in some sense, the access to the information. Each information object will have an access control list saying which keys are allowed to read it or work with it.
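An access control list keyed by public keys might look like the following sketch. Everything here is my own illustration -- the object, the field names, and the stubbed-out signature check are assumptions, not anything specified in the talk:

```python
# Minimal sketch of key-based access control: each information object
# carries a set of public-key fingerprints allowed to read it. The
# request-signature verification is stubbed out as a boolean flag.
import hashlib

def fingerprint(public_key_bytes):
    # Short SHA-256 fingerprint standing in for a full public key.
    return hashlib.sha256(public_key_bytes).hexdigest()[:16]

# A hypothetical "toothbrush page" tagged with the keys allowed to read it.
toothbrush_page = {
    "content": "location: bathroom sink",
    "acl_read": {fingerprint(b"owner-public-key"),
                 fingerprint(b"dentist-public-key")},
}

def may_read(obj, requester_key_bytes, signature_valid):
    """Grant access only to a key on the ACL whose signed request verified."""
    return signature_valid and fingerprint(requester_key_bytes) in obj["acl_read"]

assert may_read(toothbrush_page, b"owner-public-key", signature_valid=True)
assert not may_read(toothbrush_page, b"stranger-public-key", signature_valid=True)
```

The design choice worth noticing is that the ACL names keys, not accounts: possession of the matching private key, demonstrated by a valid signature, is the whole credential.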
We have some ongoing research in the Lab for Computer Science on public-key infrastructures addressing precisely the question of ease of use. The SPKI and SDSI systems, if you know the acronyms, are aimed at trying to make that easy, because once you've answered these questions of who's allowed to know where you are, you've got to somehow convey that to the computer in an intelligible way. And that needs to be easy to do if we're all going to do it. Part of the SDSI system that we've developed is a distributed naming system where every key, in essence, becomes its own naming authority and creates certificates defining groups and subgroups of names to be used in your security policy specifications.

And finally, a couple of issues that come up in Oxygen. Intentional naming was mentioned by Hari Balakrishnan. There are real security concerns I have about that if it's not done carefully, but I think we can handle it. When a device says, I'm this kind of a device, why should you believe it? It says, I'm a fiber optic cable. It's broadcasting its service.
If you're trying to get into a system and attack it, you'll have people spoofing things, and so on. So we have the same issue that we actually have now with applets. When you download an applet, why should you trust it or believe that it's -- oh, it's digitally signed. So these things will probably be digitally signed as well.

And finally, we have in Oxygen the question of setting up a collab region, which was talked about in Hari's example: getting a group of people together and setting up, essentially, a virtual private network between them, with their names being used to identify their keys, and then the keys being used to set up a symmetric key that's used to encrypt all of the communication between them. So a collab region, as a fundamental notion of Oxygen here, is actually something we think we know how to do pretty well. This is one of the easier problems to solve. The harder ones are privacy and supporting the kind of access control you want.
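The symmetric-key piece of that setup can be sketched as follows, assuming the members' public keys have already been authenticated and the fresh group key has somehow been delivered to each member -- that key-distribution step, and authentication of the ciphertext, are elided here. The SHA-256 keystream below is a toy stand-in, not a real cipher:

```python
# Sketch of a collab region's shared session key: one symmetric key per
# session, known to every member, encrypting all group traffic.
import hashlib
import os

group_key = os.urandom(32)  # fresh 256-bit symmetric key for this session

def xor_stream(key, nonce, data):
    """Toy stream cipher (SHA-256 keystream) -- illustration only; a real
    system would use an authenticated cipher such as AES-GCM."""
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        block = hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        out.extend(block)
        counter += 1
    return bytes(a ^ b for a, b in zip(data, out))

nonce = os.urandom(12)
ciphertext = xor_stream(group_key, nonce, b"meeting notes: draft agenda")
# Any member holding the group key can decrypt (the XOR stream is symmetric).
assert xor_stream(group_key, nonce, ciphertext) == b"meeting notes: draft agenda"
```

The shape matches the talk's description: public keys establish who the members are, and the expensive asymmetric operations are used only to agree on the cheap symmetric key that carries the actual traffic.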
So setting up a collab region -- you still sign your messages, encrypt them with the group key, and so on. For large groups, we have more elaborate things.

So we're moving into this new world where things have changed. All the defaults are different. We have to work. We're getting a lot of benefit out of this digital revolution -- things are easy to access -- but we pay a price. And the price is that we have to work now to say what it is that shouldn't, all of a sudden, become easy. Where should we set up the artificial barriers -- the cryptographic barriers, or other kinds of barriers -- against the access of information, in order to preserve for ourselves the privacy we need to have?

So I'll stop there. And thank you for your attention.

Well, privacy and anonymity are, as the point has just been made, not quite the same thing. But it's certainly a good lead-in.
We're going to discuss soon, in a panel right here, whether the anonymous remailer that LCS maintains on the internet should be shut down. So please be sure to come back for that panel discussion. We're going to take a little break just before it. Why don't you all please come back here by 5 minutes after 4:00? That'd be a 15-minute break. Thanks very much.