URS GASSER: All right. Hello, everyone, and welcome. How are you? Let's try again. How are you this morning?

[APPLAUSE]

A very warm welcome to the Berkman Klein Center, to Harvard Law School, on this sunny but windy and slightly chilly day. We're really delighted to have you here. My name is Urs Gasser. I serve as the executive director of the Berkman Klein Center. I'm also on the Harvard Law School faculty with my colleague, whom I will introduce in just a bit. And of course, I am very, very pleased to moderate this special conversation about the ethics of digital transformation. And I'm, of course, even more pleased and actually honored to welcome the federal president of Germany, Frank-Walter Steinmeier.

[APPLAUSE]

FRANK-WALTER STEINMEIER: Hello.

URS GASSER: [SPEAKING GERMAN]

FRANK-WALTER STEINMEIER: [INAUDIBLE]

URS GASSER: We are joined by a wonderful group of colleagues and experts. And I will-- if you are not mad at me, I will just briefly introduce you, and we will get to know each other a bit better as we go along and talk about your work, and of course as we engage in the opening conversations. So this should be as interactive as we can make it. Eva Weber-Guskar is an ethicist, a philosopher currently at Ruhr University Bochum. She's doing amazing work, and I'm already looking forward to learning from you today. Quite often, these debates about ethics happen without having philosophers around us. I'm grateful that you're here. Matthew Liao is at NYU, and is a professor of bioethics. He also runs a center on the same topic, and I'm particularly curious to hear also some of the lessons learned from past cycles of technological innovation, as we now talk about digital things and AI, IT, and the like. Jeanette Hofmann, welcome back. Great to have you here.
Jeanette is a professor of internet policy at the Free University of Berlin. She's also the director of the Alexander von Humboldt Institute for Internet and Society in Berlin. I've introduced the president already, and he doesn't need an introduction. So next we have Dean Melissa Nobles, who is the Dean of the School of Humanities, Arts, and Social Sciences at MIT. Really great to have you here, as we will hear more about how MIT is building a new school of computing.

MELISSA NOBLES: Correct.

URS GASSER: The Schwarzman College.

MELISSA NOBLES: That's right.

URS GASSER: Lots of interesting things happening there at the intersection of engineering and ethics. So we're looking forward to your thoughts on this conversation. Wolfgang Schulz, professor for media and public law at the University of Hamburg. He's also the director-- and now I have to read that, because I still cannot remember it-- the director of the Leibniz Institute for Media Research, which is known to me as the Hans-Bredow-Institut, but I learned that it's important to emphasize the Leibniz part. Crystal Yang, really great faculty colleague here, professor of law at Harvard Law School. Does wonderful work, important work on criminal justice and the use of algorithms and data in that area. We'll talk more about that. So as you can see, a fantastic lineup. And of course, I'm so grateful to you, Mr. President, that you joined this group as a participant, and I get a sense already that you're ready to jump in and will take over the moderation function in due course, which is totally fine and will make my job easier.

So one or two logistical notes. First, we will end at roughly 11:30. That's the plan. Some segments of the conversation may be in German. You should have translation.
If it's OK with you, I will continue to moderate in English, and I think the reason is straightforward: my Swiss accent is so strong when I speak German that it's easier for the Germans when I talk English, so just to make that clear.

[APPLAUSE]

So with that, Herr Bundespräsident, here's the question for you to start us off. We met the last time in 2012 in Berlin, and had a conversation about what it means to make good policies for the internet age. And I actually googled this morning and tried to remember what happened in 2012, right? It seems like in internet times it's more like 70 years ago than seven years ago.

FRANK-WALTER STEINMEIER: Yeah.

URS GASSER: And when I googled what happened in the technology space back then: the Google Glass project was kicked off. The iPad Mini was introduced. Facebook went public. And a bill was signed in California allowing and regulating self-driving cars. So I'm wondering, that seems to be a very different stage in our digital transformation process, if you look back only a few years and now fast forward to 2019. Where have we arrived? What are you thinking about? What are your concerns? What are your hopes, and how does that connect with the topic of today? Mr. President.

FRANK-WALTER STEINMEIER: [SPEAKING GERMAN]

INTERPRETER: Well, thank you very much indeed for these kind words of welcome. Ladies and gentlemen, dear students, I think it is fantastic. You know, you've been given the alternative to enjoy a sunny, though somewhat windy, late autumn day, but you've nevertheless decided in favor of the alternative of coming here into an enclosed room to listen to us. Thank you very much for that. And thank you also for reminding me of the year 2012, which I remember well for quite different reasons, and I'll get back to that in a second.
You know, my visit to Boston in 2012. But allow me to begin by saying that this is not the very first time I have been here, and I'm always happy to be back in this academic, scientific center, a center not only with regard to the United States of America but also in a much broader sense, because it is a center that is exemplary in bringing together researchers and academics from all parts of the world, from all countries of the world, to make them work on subjects of common concern. And when I remember that visit in 2012, or earlier visits too and later visits, you know, no matter whether we talked about foreign policy issues or other issues, either here in the hall or in other places in Harvard, whether we talked about questions to do with climate policy, about the state of affairs of transatlantic relations, rest assured, every time that I came here, I returned home having benefited to a large extent from the discussions I had at Harvard.

There was one exception, and that brings me back to 2012, really. Just once did I risk myself, my whole existence here in Harvard, because I allowed myself to be talked into throwing the first pitch in a baseball match. And I was extremely naive. Never before had I attended a baseball match, held a baseball in my hands, or been in a baseball stadium. And my then colleague, Secretary of State Condoleezza Rice, we all remember her, was aghast when she heard what I was about to do. And the only comment she gave to me was, "Don't do it." But you know, that is typical of us Germans. I had accepted, and I didn't want to go back on my promise. So once we entered the stadium on the afternoon of that day, I got an inkling of why my colleague Condoleezza Rice was so aghast, because the stadium was filled to the very last seat, 40,000 or 50,000 people in the audience. And I had a certain feeling that they hadn't come just because of me, but they'd come because they wanted to watch the match, a match that was the match, really.
Here in the United States, the Red Sox were playing the New York Yankees. And I realized all of a sudden that this is not just any match. It's about religious issues, really. Still, it worked out somehow. I survived it. And having survived that experience, I was happy to come back every time. I wasn't shy of returning to Boston, to Harvard.

But today it's a different topic, really, that brings me here, different from the topics we focused on in the past years. We're no longer on the threshold of digitization, of the digital age. We have already entered that age. I've come here because the topic we will be talking about directly refers back to topics I'm focusing on in my presidency, the future of liberal democracy, that is. How does the internet, how do Facebook, Twitter, algorithms, anonymity on the internet, how do all these things change the democratic culture of debate, which is of such great importance to us in Germany, just as it is to you in the United States of America? Despite the daily waves of outrage that you have to live with, how can we make sure that we keep a general overview? How can we distinguish what is important from what is unimportant? And this culture of thinking in simple opposites, yes or no, black or white, harsh approaches, does it take away from us our ability to see the nuances between black and white? Are we capable of doing that? Do we continue to be capable of entering into compromise, which I believe to be vital for any democracy, if we no longer have the time to differentiate, to see things in nuances, to carefully weigh the pros and cons, because it's no longer popular? We talked about this yesterday in Boston with American and German academics in great depth.

Today, though, we are again talking about digital transformation, how that has changed our lives and daily experiences.
But as Mr. Gasser kindly indicated, we will be focusing on a different priority. We're not really focusing on the question of whether we need digital technologies. They're there anyway. No one is denying the fact that they open up enormous opportunities for all of us, when it comes to fighting poverty, for example, when it comes to tackling the impact of climate change, when it comes to combating diseases and their effects. Undoubtedly, Germany is a country that has no natural resources of its own, apart from its human resources. We want to be a country that has technology to offer, and we want to participate in the developments they entail. And that is a kind of introductory remark on my part.

As regards the topic we intend to talk about today, a code of ethics for the digital transformation, I would like to just briefly focus on why this topic is so important to me. I have actually come from two visits, and I refer back to those visits. I visited Stanford last year, focusing again on the future of digitization. When we traveled there, a few days before we left, we read in the papers that Elon Musk had bought up a company that was engaged in research on brain implants, and that was doing very well in that regard, and that this might help tackle diseases like Parkinson's and Alzheimer's. I learned a lot about the imagination of researchers during my encounters there, how one can influence brain activities with the help of implants and algorithms. This has undeniable and obvious consequences. But at the end of the discussion, it wasn't I, but someone who is very well known in the United States, George Shultz, the former Secretary of State under Ronald Reagan, who also is, or was at the time, a member of the board of Stanford University. He said at the end of the discussion: guys, really, I am fascinated by the scenario you have been painting, been drawing, of the future. But let's not forget, we are living in a democracy.
And democracy relies on independent, self-determined, confident human beings if it is to survive. And he addressed himself to the researchers and the academics: so when developing these technologies, don't forget to think of the consequences of your inventions and how that fits into democracy and its principles.

My second trip, and I'm going to be brief about this, was my visit to China, where, again, we also focused on this topic. And we also talked about social scoring, the opportunities, the perspectives that result for the members of a society. The debates we had were not easy, because at the beginning the Chinese didn't understand why we were asking these questions at all, and why we would find some of these things complicated that come up in the context of social scoring. Because they said, we have 80, 90% popular support for these topics. Why are you against it?

You know, we live under different political circumstances, and are scared and shocked by the idea of having to submit to a total surveillance of all aspects of our lives, the idea that no matter what we do, this might be linked up to a system that assesses our performance in a negative or a positive way, and that this, of course, has an effect on the way we develop as human beings, that hopes, wishes, and dreams are becoming externalized, that they are stored in software I no longer have any influence over. For our concept of individual responsibility and of personal freedom is being called into question by such an approach.

We, however, know that this is not a problem that is exclusive to the Chinese. German companies, American companies that invest in the United States, that employ people in the United States, they will be working under the very same conditions. And thus we have to have an interest in what is happening here. But let me close by saying that the debate in China hasn't come to an end yet. It is still ongoing.
We don't know what will be the outcome, the result, of all those tests and experiments that are being carried out in China right now. But the obvious question is on the table. Is there something like a minimum of morals for the digital age? Shouldn't we work to have something like that, like a common expression of the limits of the digital future in the decades or centuries to come? Which brings me to the question of whether we do not really need a much more intensive exchange of thoughts between the tech community and the political scientists about the philosophy of the individual than is happening at this point in time, at least as I see it.

Well, you know, if I were to choose, I would very much like to be in a position where I could leave Boston today having received the confirmation from all of you that I need not be afraid, that I need not be concerned, that the debate is taking place with the very intensity that I would wish to see attributed to it. But whether that is the case or not, we will have to hear and see from you. I very much look forward to this debate.

[APPLAUSE]

URS GASSER: Thank you so much for setting the stage so beautifully. And I realized while you were speaking that my American colleagues didn't have translation, but you followed it very nicely.

CRYSTAL YANG: And I very much appreciated the baseball reference.

MELISSA NOBLES: Yeah, I got that part. There were a couple of words in there I kind of got.

URS GASSER: Yes, but you really set the stage so well. Yes, that may be helpful.

MELISSA NOBLES: Great, thank you.

FRANK-WALTER STEINMEIER: Should I repeat it?

MELISSA NOBLES: Yeah, thank you. Bit of a summary. I heard "Stanford." I heard--

URS GASSER: Right, right. And of course you picked up on that, which is exactly the segue to my question.

MELISSA NOBLES: Sure.
URS GASSER: So the president was putting, sort of, the societal change that we're going through, where technologies of different sorts play such a vital role, into the larger context of the future of democracy, and the question of how we want to live our lives and interact with each other and shape our future. And within that, he also referred to, as you picked up on, a trip to Stanford, and pointed out already, and that's a theme I want to follow up on for a few minutes--

MELISSA NOBLES: Sure.

URS GASSER: --that there are tremendous opportunities, although currently the focus is really on the risks of new technologies, essentially, in public discourse, for sure, particularly in Europe. But before we go into risk mode and talk about all the pitfalls of these new technologies, I would like to pause and really zoom in a little bit on this question: what can technology do for climate change and other areas that the president mentioned?

FRANK-WALTER STEINMEIER: Against climate change.

URS GASSER: Against, yes, against.

MELISSA NOBLES: Right, right, exactly.

URS GASSER: To address some of the big, big challenges of our time.

MELISSA NOBLES: Sure.

URS GASSER: And there is this other place closer to home, MIT--

MELISSA NOBLES: Right.

URS GASSER: --where many of these technologies are developed in the lab, and I was wondering whether you would be willing to share maybe two, three examples, from also your humanities perspective--

MELISSA NOBLES: Sure.

URS GASSER: --that give you hope and optimism, maybe.

MELISSA NOBLES: Sure, I'm glad to do that. Good morning, everyone. Well, you know, one of the things about MIT, I kind of hesitate in a certain way to be able to say two or three, since the Institute is connected to technological innovation. So I think I'd rather say a bit about what has made MIT such a leader in thinking innovatively. And a big part of that has been the commitment to collaboration across all five schools.
So it's in recognition that many of the problems that the world faces are obviously global in nature, and they require knowledge from all domains. It isn't just a scientific problem. It isn't just an engineering problem. It isn't just an economic problem or a social problem. It is all of these things together. And part of our strength has been putting together research programs to deal with these. So we have, for example, the MIT Energy Initiative, which brings together professors from engineering, science, humanities, also social sciences, the Sloan School, to look at the economics and the business models of what is sustainable and what is not, as well as architecture and planning, to look at the ways in which climate change and the way we use energy are changing how we structure cities.

So it is the scope of the problems, and a commitment to putting in intellectual energies that are commensurate with them, that, I think, has set MIT up well for thinking about the future. So I hesitate to name any one in particular, except to say that the problems are so massive, there is no way that technology cannot be a part of it. And the issue is, how do we think creatively about technology to make sure that's happening? And that's a big part of what education has to do, to connect students to understand that the technology is an expression, is a human endeavor, right? We create the technology. The technology doesn't create us, and we have to start with some basic commitments. So that's where we are now. And I look forward to saying a bit more later on about the College of Computing.

URS GASSER: Fabulous. Thank you so much. That's very helpful. Some sort of an iteration on this theme, and taking your point, where you argue, well, there's no future without getting technology right in a way that helps us to address some of these big challenges we face as humanity, but also to embrace the opportunities.
And I was wondering whether you would be willing to share your thinking around this topic. Many of the ethical debates of these days are focused on ethics in the sense of telling us what not to do, right? What lines not to cross. And we will definitely return to that, and this will be the key part of the panel. But before we go there, I was wondering, is there some sort of an ethical obligation for the good use of technology? And basically a moral imperative that would almost be in contrast to the precautionary principle that's so popular in Europe these days, and say, no, we have to double down on developing technologies for the social good and in the public interest. How does a philosopher or ethicist think about that?

EVA WEBER-GUSKAR: Yeah, thank you for that question. I'm happy to answer. So I think there are at least two ways to understand your question. First, we may ask if there is a moral obligation to generally use digital technology, now that it has been invented and developed up to a point where so many concrete applications are possible. But my answer to this would be no. There is no general moral obligation to do what can be done. Because digitization is just a means, and moral obligation refers to ends, to purposes, not the way we get there. And so it is an open question whether digitization is the best way to get there, where we want to go, to our moral purpose, which is, as you already pointed out, good democracy, human flourishing, and so on. And we just have to see exactly where digitization is helpful and where not.

But on the other hand, if you ask if we theorists, theorists like us here on the panel, should point out possible positive uses of technology more often, I would say yes, and that's important, too.
Because otherwise, the development of digital technology is mostly driven by interests in financial profit, and that is not the best premise for the best outcome from a moral perspective. So it would surely be good to have more people pointing out the positive uses, but I think there are already quite a few examples of that, too, and also examples where reflection and realization go together. For example, at the Weizenbaum Institute in Berlin, there was a fellow this summer, a young colleague, who went and developed an app which enables people from different parts of the political spectrum to chat and discuss with each other online, for example.

And yeah, I mean, there are a lot of opportunities, and we should point them out. But I also want to add that these projects all have to be chosen carefully. Because, as you already mentioned with climate change, we can do good things against climate change with digitalization. But on the other hand, we also have to be aware of the fact that digitization, that all digital technologies themselves, are consuming masses of energy. So it would be best to choose only those projects which have a really urgent reason. There has to be something important at stake for us to invent and apply new technologies. And I remember the British philosopher Derek Parfit saying that all people, all humans with two healthy legs, should use the stairs instead of the elevator in order to save energy. Because, he said, elevators are just made for people who cannot walk. And in a bit of a similar way, we should always watch out for the urgent reasons that we invent digital technologies for. And what is urgent, what is important, always depends on the domain. It's different in every domain. In medicine, for example, it's the diminishing of suffering. In law, it's justice. In a democracy, it's participation and the well-founded formation of political opinion.
And only then, when we have identified precise moral purposes and we see that we cannot attain them except by digital technologies, then I think we might be seen as obliged to use them.

URS GASSER: Wonderful. Great segue. You pointed out sort of the big questions, but also that these questions can only be answered or worked through in a particular application context. You mentioned already a few. And if I may, to get a little bit more specific and take the conversation from-- you know, go from 30,000 feet a bit lower, to 10,000 feet, maybe, and take two examples that illustrate sort of the struggle of how we embrace opportunities but also protect against risks. And Matthew and Crystal, as I already mentioned in the introduction, you have interesting work that sort of serves as a case study in our context. Matthew, focusing on health and public health and the role of technology, whether it's AI or IoT, how are some of these questions that you've identified crystallizing, and where do you see things going? What are some of the concerns? What's the state of play?

MATTHEW LIAO: Yeah, so good morning, everybody. So as Professor Gasser has said, I'm a philosopher. And I have a book called The Ethics of Artificial Intelligence coming out next March. And we cover a number of these different issues in the ethics of AI. And one of the applications of the ethics of AI is in the realm of health care. There are actually a lot of really exciting opportunities and a lot of development, a lot of things being done in the area of health care. So for example, machine learning is being deployed to screen cancer cells. It's been found to be almost as effective as radiologists. It's also been used in ophthalmology. It's been used to screen, to figure out whether an embryo is going to be viable or not.
Natural language processing is being used to figure out whether people are having suicidal thoughts. So there are a lot of really exciting developments that are currently underway. And what that means for us is that it can really, for example, reduce health care costs in the US. I think we spend about $3 to $4 trillion on health care each year. And so one of the things that machine learning can do is reduce administrative costs in health care, for example. It can also assist in facilitating drug discovery. And finally, another example is, it can really realize the vision of precision medicine. So for example, Fitbits, wearables, to figure out healthy lifestyles, what you should be eating, your calorie intake, and so on and so forth. So all those are really, really exciting developments.

I'm an ethicist, so I also think about some of the ethical problems. And I just want to very quickly share some of the ethical concerns with you as well. So one of the biggest challenges with machine learning is that it requires a lot of data. And so what that means is, someone's got to go out there and collect all these data. And then you get into issues about privacy, especially in health care. It's personal data that we're talking about. So one obvious example is Facebook and Cambridge Analytica collecting a lot of information from Facebook users. Another example is GlaxoSmithKline. They just recently invested in this company 23andMe, which is an ancestry-type service: you upload your information, and it gets your genetic information. So now they have all that database. And so one of the things we need to really be worried about is whether they're collecting the data appropriately, are they violating rights, what are the implications for the individuals? Another issue is going to be the garbage in, garbage out problem. So the algorithms that we're using today are only going to be as good as the data themselves.
And so what we're finding is that sometimes the data sets that we're collecting are not-- they don't have accurate representations of the subjects. So take, for example, self-driving cars. It turns out that self-driving cars are not so good at detecting people of color, because the training sets, the training data that they use, don't have enough people of color in the data set. And so that's a problem when we deploy that sort of data, that sort of algorithm, in the wild.

And I'll just say one more thing. The biggest concern I have with machine learning right now is there's something called deep learning. And deep learning is actually a technical term. It just means that it's using a big neural network to figure out how a machine should act. And it's powered a lot of the recent developments since 2012. It's powered a lot of the new breakthroughs. But one of the problems with deep learning is that it just doesn't capture the causality, the causal relations. It doesn't really understand what it's doing. And so it's linear regression. It's a lot of math. But here's one problem. So there's something called a generative adversarial network. It's a type of-- so one type of attack is something called a single-pixel attack. So machine learning is very good at image classification. It can take images and classify them very accurately. But researchers have found that if you just take an image, say an image of a car, and you take just one pixel and change it from black to white, the machine learning will completely screw it up. So for example, with the image of the car, it will now classify that image as a dog with 99% confidence. And just imagine deploying that type of machine learning in the context of health care, when people's lives are at stake, or in the context of self-driving cars, right?
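The single-pixel failure described here can be sketched in a few lines of Python. The snippet below is only a toy: it does not reproduce the published attacks on real deep networks, and the class names, weights, and chosen pixel are all illustrative assumptions. It uses a deliberately brittle linear classifier purely to show how an overweighted input feature lets a single pixel flip a confident prediction.

# Toy sketch of a single-pixel flip (illustrative only; not a real model).
import numpy as np

rng = np.random.default_rng(0)

# A tiny 8x8 grayscale "image" with values in [0, 1]; pixel (3, 3) starts dark.
image = rng.random((8, 8))
image[3, 3] = 0.02

# Hypothetical two-class linear classifier: scores = W @ x + b.
# Class 0 = "car", class 1 = "dog". The weights are small and random except
# for one pixel the model leans on far too heavily, which is the kind of
# brittleness adversarial attacks exploit in learned models.
W = rng.normal(0.0, 0.1, size=(2, 64))
W[1, 27] = 25.0                       # pixel (3, 3) flattened is index 3 * 8 + 3 = 27
b = np.array([0.5, -5.0])

def predict(img):
    """Return (label, softmax confidence) for a flattened 8x8 image."""
    scores = W @ img.ravel() + b
    probs = np.exp(scores - scores.max())
    probs /= probs.sum()
    k = int(np.argmax(probs))
    return ["car", "dog"][k], round(float(probs[k]), 3)

print("original image:   ", predict(image))      # roughly ('car', 0.99)

perturbed = image.copy()
perturbed[3, 3] = 1.0                 # change one pixel from dark to white
print("one pixel changed:", predict(perturbed))  # roughly ('dog', 1.0)

In real systems the brittle direction is not planted by hand; attack methods search for it, which is why the robustness worry carries over to genuinely deep models.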
And so I think we're going to get into more of these discussions later, but I think that's where we have to be careful about rolling out these technologies.

URS GASSER: Crystal, does that sound familiar, listening to these stories from health, when you look at your work on the use of algorithms and data in the criminal justice system, or where are the differences?

CRYSTAL YANG: Yeah, I think there are a lot of similarities. And I think, as some of the other panelists have pointed out, while algorithms are now basically used in so many parts of society, one of the areas where they have had a very dramatic increase in usage is the United States criminal justice system. And the algorithms here we often call risk assessment instruments, because what the algorithms are trying to do is predict somebody's future criminality. And these instruments now are used at various stages of the criminal justice system, from policing, to pretrial and bail decisions, to sentencing, to probation and parole as well.

And just some examples. Take predictive policing. One of the common technologies is called PredPol, which is used by the Los Angeles Police Department and over 60 other police departments across the United States. It uses historical data on crime types and where crimes have happened to predict future criminal incidents. In sentencing, many states now allow judges to consider risk scores that are meant to predict the future risk of committing new criminal behavior. One common algorithm here, which has been in the news a lot and you may have heard of, is the COMPAS algorithm. It's a proprietary algorithm, so we actually don't know exactly the underlying algorithmic structure. It classifies individuals on a scale of 1 to 10 in terms of their predicted likelihood of recidivism, using how that person answers questions on a 137-question survey. So these are just some of the examples, and I think they raise a huge host of issues and challenges.
One that I think requires a lot of understanding from philosophers and ethicists is, do these risk assessment tools even have a role to play in the criminal justice system? I think some view the endeavor of predicting future risk as wrongheaded, and believe that, because of this garbage in, garbage out type of problem with the input data, using algorithms to predict future risk will only entrench or potentially exacerbate the inequalities and inequities that we see in society at large. On the other hand, and I place myself more in this camp, while there is acknowledgment that the algorithms are often imperfect, I think it's also important to consider the relevant counterfactual. The counterfactual is not a world free of inequality, of inequity. It's a counterfactual in which we have human decision makers. And guess what? There's lots of evidence that human decision makers have a big role to play in perpetuating inequalities through bias and inconsistency. So there's a role, I think, in considering what we are comparing the algorithms to. It's not a perfect world. It's humans.

I think another set of design questions that Matthew has gotten at is, in the criminal justice system, there are lots of open, unresolved questions about how we design an algorithm. If we're going to predict risk, can we consider individual characteristics like somebody's race or ethnicity, what we often call protected characteristics? If you can't, can you consider non-protected characteristics, things like education or where somebody lives, which can effectively proxy for a person's race or ethnicity? There are also complicated questions about how to evaluate if an algorithm is doing what we want it to do. What does it mean, for instance, for an algorithm to be fair? It turns out that here the law has not had so much to say so far about how to define or measure fairness.
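To make the measurement question concrete, here is a small, self-contained sketch with invented numbers: hypothetical confusion counts for a made-up "high risk" flag applied to two groups with different underlying reoffense rates. It computes two common criteria, predictive parity (equal positive predictive value) and equal false positive rates, and shows them pulling apart; the impossibility results mentioned just below concern exactly this tension.

# Two fairness criteria on invented data (all counts are hypothetical).
#   tp: flagged high risk and did reoffend    fp: flagged high risk, did not
#   fn: flagged low risk but did reoffend     tn: flagged low risk, did not
groups = {
    "group_a": {"tp": 280, "fp": 120, "fn": 20, "tn": 80},   # base rate 0.60
    "group_b": {"tp": 70,  "fp": 30,  "fn": 80, "tn": 320},  # base rate 0.30
}

for name, c in groups.items():
    total = sum(c.values())
    base_rate = (c["tp"] + c["fn"]) / total          # share who actually reoffend
    ppv = c["tp"] / (c["tp"] + c["fp"])              # predictive parity criterion
    fpr = c["fp"] / (c["fp"] + c["tn"])              # error-rate criterion
    print(f"{name}: base rate {base_rate:.2f}, PPV {ppv:.2f}, FPR {fpr:.2f}")

# Output (rounded):
#   group_a: base rate 0.60, PPV 0.70, FPR 0.60
#   group_b: base rate 0.30, PPV 0.70, FPR 0.09
# The flag is equally "trustworthy" when it fires (same PPV in both groups),
# yet people in group_a who would not reoffend are flagged far more often.
# When base rates differ, it is generally impossible to equalize both kinds
# of criteria at once, which is the normative choice discussed next.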
769 00:37:39,560 --> 00:37:43,150 And even outside the law, there's a very lively computer 770 00:37:43,150 --> 00:37:46,450 science debate about algorithmic fairness, where there's 771 00:37:46,450 --> 00:37:48,530 very different definitions of fairness, 772 00:37:48,530 --> 00:37:50,770 that in many circumstances we could all 773 00:37:50,770 --> 00:37:53,810 say that one sounds great or that one sounds great, 774 00:37:53,810 --> 00:37:55,750 but it's actually been shown mathematically 775 00:37:55,750 --> 00:37:59,440 that in many instances it is impossible to simultaneously 776 00:37:59,440 --> 00:38:02,870 satisfy all those notions of algorithmic fairness. 777 00:38:02,870 --> 00:38:05,650 And so then that requires a normative choice by us 778 00:38:05,650 --> 00:38:08,740 as a society or a legal system to choose 779 00:38:08,740 --> 00:38:11,740 which of those definitions of algorithmic fairness 780 00:38:11,740 --> 00:38:12,490 should dominate. 781 00:38:12,490 --> 00:38:15,520 So I think those are just a couple of the key issues 782 00:38:15,520 --> 00:38:17,967 and challenges that I see in the criminal justice system. 783 00:38:17,967 --> 00:38:19,550 URS GASSER: No shortage of challenges. 784 00:38:19,550 --> 00:38:21,133 CRYSTAL YANG: No shortage, absolutely. 785 00:38:21,133 --> 00:38:24,005 URS GASSER: It was clear from both stories. 786 00:38:24,005 --> 00:38:30,670 I'd love to build up on this and ask Wolfgang, 787 00:38:30,670 --> 00:38:34,450 Crystal made the point that part of it 788 00:38:34,450 --> 00:38:37,220 is a story about technology, but part of it 789 00:38:37,220 --> 00:38:39,980 also seems to be a story about society at large, 790 00:38:39,980 --> 00:38:44,300 about the institutions we already have in place. 791 00:38:44,300 --> 00:38:46,940 Part of it seems to be about human nature, 792 00:38:46,940 --> 00:38:48,530 with our own biases. 793 00:38:48,530 --> 00:38:52,910 So how do you think about that as we have this intense debate, 794 00:38:52,910 --> 00:38:58,430 debates about AI decision making versus human decision making? 795 00:38:58,430 --> 00:39:02,990 And should we replace judges by AIs or not? 796 00:39:02,990 --> 00:39:06,977 How much is it about technology, really? 797 00:39:06,977 --> 00:39:08,810 WOLFGANG SCHULZ: I think to respond to that, 798 00:39:08,810 --> 00:39:11,870 I have to go to flight level 10,000 again. 799 00:39:11,870 --> 00:39:14,690 I would say that I am descending later. 800 00:39:14,690 --> 00:39:16,608 801 00:39:16,608 --> 00:39:17,900 URS GASSER: Only in a good way. 802 00:39:17,900 --> 00:39:18,530 Don't worry. 803 00:39:18,530 --> 00:39:20,030 WOLFGANG SCHULZ: In a good way, yes. 804 00:39:20,030 --> 00:39:20,800 I hope. 805 00:39:20,800 --> 00:39:25,530 So when we talk about technology in expert circles 806 00:39:25,530 --> 00:39:28,970 but in society at large, then we very often 807 00:39:28,970 --> 00:39:30,930 have a distinction between here is the society, 808 00:39:30,930 --> 00:39:32,390 here is the technology. 809 00:39:32,390 --> 00:39:34,910 And that is a dangerous thing, because then we 810 00:39:34,910 --> 00:39:38,420 frame technology as a kind of natural disaster 811 00:39:38,420 --> 00:39:41,770 that is coming, and we have to build walls to cope with that. 812 00:39:41,770 --> 00:39:45,470 And we are not in the mode of creating 813 00:39:45,470 --> 00:39:47,990 the technology as a society and together 814 00:39:47,990 --> 00:39:49,710 with the different disciplines. 
815 00:39:49,710 --> 00:39:51,453 So I think we have to be very careful 816 00:39:51,453 --> 00:39:53,120 how we talk about these things and where 817 00:39:53,120 --> 00:39:55,500 we talk about tensions. 818 00:39:55,500 --> 00:40:00,320 And I can, I think, build on what Crystal said, because we 819 00:40:00,320 --> 00:40:03,530 are doing some research on the criminal justice system 820 00:40:03,530 --> 00:40:05,930 as well, and have done recently. 821 00:40:05,930 --> 00:40:08,930 And what I find interesting is that when 822 00:40:08,930 --> 00:40:13,340 we talk about technology coming into processes, then 823 00:40:13,340 --> 00:40:15,710 we start thinking about what our quality 824 00:40:15,710 --> 00:40:18,230 measures are as a society here. 825 00:40:18,230 --> 00:40:20,810 And I had a discussion with German judges 826 00:40:20,810 --> 00:40:24,140 a couple of months ago, and we are talking about sentencing. 827 00:40:24,140 --> 00:40:28,010 And we are talking about AI supporting that, 828 00:40:28,010 --> 00:40:31,010 and then I raised the question of explainability, 829 00:40:31,010 --> 00:40:33,660 which is one of the issues in AI, 830 00:40:33,660 --> 00:40:37,790 that say we cannot really see and explain what happens there. 831 00:40:37,790 --> 00:40:40,460 And then one of the judges said, wait a moment. 832 00:40:40,460 --> 00:40:42,950 Ask me, can I explain what I am doing 833 00:40:42,950 --> 00:40:45,120 when I come to this decision? 834 00:40:45,120 --> 00:40:47,030 And I'm not sure that I really can do that. 835 00:40:47,030 --> 00:40:51,660 I can give a reason that is valid in the legal system, 836 00:40:51,660 --> 00:40:55,370 but I cannot really explain what my motives were here. 837 00:40:55,370 --> 00:40:58,770 And then we had a debate on what are the factors there. 838 00:40:58,770 --> 00:41:02,360 And in the German legal system and the criminal law, 839 00:41:02,360 --> 00:41:05,330 it's not very well elaborated what the criteria are. 840 00:41:05,330 --> 00:41:07,250 So it's very, very vague. 841 00:41:07,250 --> 00:41:08,930 And so we had a very fruitful debate 842 00:41:08,930 --> 00:41:11,120 on what the values actually are. 843 00:41:11,120 --> 00:41:14,240 And you can have the same in other fields of society. 844 00:41:14,240 --> 00:41:17,750 We have next week or the week after, 845 00:41:17,750 --> 00:41:20,540 a workshop with computer scientists 846 00:41:20,540 --> 00:41:24,800 and people from communication science and law, 847 00:41:24,800 --> 00:41:31,310 talking about how to understand diversity in recommender 848 00:41:31,310 --> 00:41:32,990 systems for the media. 849 00:41:32,990 --> 00:41:36,720 And we want to come up with ideas what that actually is. 850 00:41:36,720 --> 00:41:38,390 And then you have to go back, and what 851 00:41:38,390 --> 00:41:40,920 do you want as a society actually, 852 00:41:40,920 --> 00:41:42,990 when you talk about diversity? 853 00:41:42,990 --> 00:41:46,280 So I think that's a good thing that technology forces 854 00:41:46,280 --> 00:41:51,710 us to ask those hard questions about societal values 855 00:41:51,710 --> 00:41:54,920 and to better understand what makes human decision making 856 00:41:54,920 --> 00:41:55,880 so special. 857 00:41:55,880 --> 00:41:58,610 We are talking a lot about things like tested knowledge 858 00:41:58,610 --> 00:42:02,420 and tested norms, things that we all understand because we 859 00:42:02,420 --> 00:42:04,020 are part of the society. 
860 00:42:04,020 --> 00:42:06,380 And we cannot really explain why we do that this 861 00:42:06,380 --> 00:42:10,310 way or that way, because it's tacit knowledge or tacit norms. 862 00:42:10,310 --> 00:42:13,460 And that is something that you cannot really, now, 863 00:42:13,460 --> 00:42:15,500 I would say, build into technology. 864 00:42:15,500 --> 00:42:18,500 That would require technology to be part of society 865 00:42:18,500 --> 00:42:21,900 and learn in interaction, and I think we are far from that 866 00:42:21,900 --> 00:42:22,400 so far. 867 00:42:22,400 --> 00:42:27,470 So I believe that this is a twist of the debate 868 00:42:27,470 --> 00:42:30,860 that very often we do not really include 869 00:42:30,860 --> 00:42:34,640 in our conversation, when we talk about this society 870 00:42:34,640 --> 00:42:38,740 here, technology there aspect. 871 00:42:38,740 --> 00:42:40,610 URS GASSER: So if Wolfgang is right, 872 00:42:40,610 --> 00:42:43,670 and he is most often right, as we know-- 873 00:42:43,670 --> 00:42:45,170 JEANETTE HOFMANN: I don't deny that. 874 00:42:45,170 --> 00:42:48,470 URS GASSER: And technology is deeply embedded in society, 875 00:42:48,470 --> 00:42:52,220 and as we heard the president opening, in his opening 876 00:42:52,220 --> 00:42:54,740 remarks, we as societies are in a learning 877 00:42:54,740 --> 00:42:58,550 process ourselves, how to cope with massive challenges 878 00:42:58,550 --> 00:43:02,180 and transformations of all sorts. 879 00:43:02,180 --> 00:43:04,520 Based on the work you've been doing, following 880 00:43:04,520 --> 00:43:07,160 early debates around internet regulation 881 00:43:07,160 --> 00:43:11,510 and approaches to governance, what's currently 882 00:43:11,510 --> 00:43:14,480 happening in this societal learning process 883 00:43:14,480 --> 00:43:18,980 as we try to identify and agree and regulate 884 00:43:18,980 --> 00:43:21,890 good uses versus bad uses across different contexts, 885 00:43:21,890 --> 00:43:23,980 and we only highlighted two examples 886 00:43:23,980 --> 00:43:26,050 and could add many more. 887 00:43:26,050 --> 00:43:29,420 What sorts of norms are emerging and what 888 00:43:29,420 --> 00:43:34,125 some of the dynamics around these norms as you observe? 889 00:43:34,125 --> 00:43:35,500 JEANETTE HOFMANN: Thank you, Urs. 890 00:43:35,500 --> 00:43:37,990 I could talk for hours on this question. 891 00:43:37,990 --> 00:43:40,900 I really like it. 892 00:43:40,900 --> 00:43:42,610 Let me go one step back. 893 00:43:42,610 --> 00:43:46,120 At the time when the internet and digital technologies-- 894 00:43:46,120 --> 00:43:47,470 URS GASSER: Switch off the mic. 895 00:43:47,470 --> 00:43:49,560 On the mic. 896 00:43:49,560 --> 00:43:56,000 FRANK-WALTER STEINMEIER: [SPEAKING GERMAN] 897 00:43:56,000 --> 00:43:57,700 JEANETTE HOFMANN: OK. 898 00:43:57,700 --> 00:43:58,408 Around the time-- 899 00:43:58,408 --> 00:44:00,283 FRANK-WALTER STEINMEIER: Well, that's better. 900 00:44:00,283 --> 00:44:02,770 JEANETTE HOFMANN: --Digital technologies really 901 00:44:02,770 --> 00:44:06,800 became more present in our societies, 902 00:44:06,800 --> 00:44:10,720 Western societies went through a long period 903 00:44:10,720 --> 00:44:16,420 of privatization and liberation from old state monopolies. 
904 00:44:16,420 --> 00:44:22,390 And we thought of that the force of the internet 905 00:44:22,390 --> 00:44:27,550 as a form of liberalization, and that kind of idea 906 00:44:27,550 --> 00:44:32,380 of self-regulation and "let the markets determine the future," 907 00:44:32,380 --> 00:44:35,950 we thought that this was a very good alternative. 908 00:44:35,950 --> 00:44:40,960 And this we have driven to a point where we now 909 00:44:40,960 --> 00:44:45,070 regard digital technologies nearly 910 00:44:45,070 --> 00:44:48,100 as a self-driving autonomous force. 911 00:44:48,100 --> 00:44:54,070 We ascribe a lot of power and agency to digital technologies 912 00:44:54,070 --> 00:44:58,120 themselves and the companies who develop them. 913 00:44:58,120 --> 00:45:00,790 I would say that the debate we see now 914 00:45:00,790 --> 00:45:05,170 about AI and ethical frameworks is 915 00:45:05,170 --> 00:45:10,360 an echo of that, the idea that ethical principles might 916 00:45:10,360 --> 00:45:14,980 be good enough to give us an orientation for the future 917 00:45:14,980 --> 00:45:17,080 of artificial intelligence. 918 00:45:17,080 --> 00:45:20,980 But we need to ask ourselves whether we 919 00:45:20,980 --> 00:45:25,690 get enough accountability out of ethical guidelines 920 00:45:25,690 --> 00:45:27,310 and frameworks. 921 00:45:27,310 --> 00:45:29,710 I just came back from the West Coast, 922 00:45:29,710 --> 00:45:33,730 where you see really a change of wind. 923 00:45:33,730 --> 00:45:36,760 Companies now begin to wonder whether they 924 00:45:36,760 --> 00:45:42,910 do not need a legal framework for the future development. 925 00:45:42,910 --> 00:45:46,300 Such a legal framework could be, for example, 926 00:45:46,300 --> 00:45:50,110 anchored in human rights, and legislation could 927 00:45:50,110 --> 00:45:52,970 build on fundamental rights. 928 00:45:52,970 --> 00:45:57,350 They could set limits to future developments, 929 00:45:57,350 --> 00:46:00,560 also to make us see that finally it is 930 00:46:00,560 --> 00:46:02,980 society that shapes technology. 931 00:46:02,980 --> 00:46:06,100 It's not that technology sets its own rules. 932 00:46:06,100 --> 00:46:09,760 But we are not really aware of it, I think, at the moment. 933 00:46:09,760 --> 00:46:12,670 We nearly have lost the capability 934 00:46:12,670 --> 00:46:16,990 to see and to recognize how we change technologies 935 00:46:16,990 --> 00:46:18,410 as societies. 936 00:46:18,410 --> 00:46:22,990 So we need to perhaps turn around a bit, 937 00:46:22,990 --> 00:46:26,590 give up this idea of complete self-regulation, 938 00:46:26,590 --> 00:46:30,520 and come to new models that sit somewhere 939 00:46:30,520 --> 00:46:35,350 in between a market approach and a pure government approach. 940 00:46:35,350 --> 00:46:38,140 We need new regulatory frameworks 941 00:46:38,140 --> 00:46:41,650 that need to work across national boundaries, 942 00:46:41,650 --> 00:46:45,380 even though we can, I think, not hope 943 00:46:45,380 --> 00:46:47,350 for multilateral approaches. 944 00:46:47,350 --> 00:46:51,850 We need something below, and the GDPR, the General Data 945 00:46:51,850 --> 00:46:56,260 Protection Regulation that the European Commission introduced, 946 00:46:56,260 --> 00:46:59,050 is often mentioned as a gold standard 947 00:46:59,050 --> 00:47:00,820 for that kind of approach. 
948 00:47:00,820 --> 00:47:03,700 Perhaps some countries can get together, 949 00:47:03,700 --> 00:47:08,810 build a legal framework and export it via trade agreements. 950 00:47:08,810 --> 00:47:10,980 [APPLAUSE] 951 00:47:10,980 --> 00:47:12,138 952 00:47:12,138 --> 00:47:13,680 URS GASSER: That is some good advice. 953 00:47:13,680 --> 00:47:15,385 Thank you, Jeanette. 954 00:47:15,385 --> 00:47:20,050 So a couple of things that I would like to follow up on. 955 00:47:20,050 --> 00:47:22,450 One is this role of the ethics principles. 956 00:47:22,450 --> 00:47:24,190 You mentioned there is a flourishing 957 00:47:24,190 --> 00:47:26,590 of ethical principles around AI in particular. 958 00:47:26,590 --> 00:47:30,610 I think 130 or something are out there. 959 00:47:30,610 --> 00:47:35,920 We tried to map some of them, but it's getting quite a task. 960 00:47:35,920 --> 00:47:39,310 But on the other hand, given also 961 00:47:39,310 --> 00:47:43,330 Wolfgang's remarks and the opening statement by Herr 962 00:47:43,330 --> 00:47:47,230 Bundespräsident, there is value to these ethical debates, 963 00:47:47,230 --> 00:47:48,570 nonetheless, right? 964 00:47:48,570 --> 00:47:50,260 And you also make this point, of course, 965 00:47:50,260 --> 00:47:53,980 that we need all different approaches and tools, probably 966 00:47:53,980 --> 00:47:55,520 including law but also ethics. 967 00:47:55,520 --> 00:47:59,680 And if I may ask you, how do you think 968 00:47:59,680 --> 00:48:01,390 about these ethical principles? 969 00:48:01,390 --> 00:48:05,080 What's the value in these ethical norms 970 00:48:05,080 --> 00:48:07,720 crystallizing into guidelines and things like that, 971 00:48:07,720 --> 00:48:09,550 whether it's by companies or by 972 00:48:09,550 --> 00:48:12,730 international organizations like the OECD or even 973 00:48:12,730 --> 00:48:15,860 by nation states? 974 00:48:15,860 --> 00:48:17,410 What's the promise, but also what 975 00:48:17,410 --> 00:48:19,810 are the limitations of ethical approaches 976 00:48:19,810 --> 00:48:24,610 of this sort when we deal with these complex, messy problems? 977 00:48:24,610 --> 00:48:27,380 EVA WEBER-GURSKA: Yeah, so ethics and law of course 978 00:48:27,380 --> 00:48:30,230 have to be distinguished, although they are connected. 979 00:48:30,230 --> 00:48:34,490 Ethics is, I would say, the explicit formulation 980 00:48:34,490 --> 00:48:37,610 of implicit norms that guide or should 981 00:48:37,610 --> 00:48:43,700 guide our everyday actions in our life, our living together. 982 00:48:43,700 --> 00:48:48,500 And law, at the core of the organization 983 00:48:48,500 --> 00:48:53,390 of a state or a nation, transforms some of these norms 984 00:48:53,390 --> 00:48:56,810 into concrete rules, the infringement of which 985 00:48:56,810 --> 00:48:59,900 is then bound up with sanctions by the state. 986 00:48:59,900 --> 00:49:01,550 So this is something different. 987 00:49:01,550 --> 00:49:04,910 And not all moral norms are legal norms and vice versa, 988 00:49:04,910 --> 00:49:05,960 of course. 989 00:49:05,960 --> 00:49:10,400 But ethical guidelines now for new topics like digitization 990 00:49:10,400 --> 00:49:13,190 can be, I think, helpful first steps 991 00:49:13,190 --> 00:49:18,800 to show something that then can be transformed into law, too.
992 00:49:18,800 --> 00:49:20,500 URS GASSER: Speaking of law, what's 993 00:49:20,500 --> 00:49:23,420 your hope that, looking at your area of research, 994 00:49:23,420 --> 00:49:29,180 the law will evolve in this dynamic situation, where 995 00:49:29,180 --> 00:49:31,970 maybe ethical principles may lead the way? 996 00:49:31,970 --> 00:49:33,620 Where do you see the promise of law 997 00:49:33,620 --> 00:49:36,140 in these debates, where we're facing 998 00:49:36,140 --> 00:49:39,110 this shift from the human towards the machine? 999 00:49:39,110 --> 00:49:42,170 CRYSTAL YANG: Yeah, I think law has a very important role 1000 00:49:42,170 --> 00:49:42,950 to play here. 1001 00:49:42,950 --> 00:49:45,500 I think I share Jeanette's general sense 1002 00:49:45,500 --> 00:49:48,020 that self-regulation is probably not 1003 00:49:48,020 --> 00:49:50,480 going to be a sufficient solution, that there 1004 00:49:50,480 --> 00:49:52,800 have to be legal interventions. 1005 00:49:52,800 --> 00:49:55,010 And the law is both instrumental in 1006 00:49:55,010 --> 00:49:58,580 that it will undoubtedly, by deciding what to permit 1007 00:49:58,580 --> 00:50:01,820 and what to prohibit, shape the behavior of governments, 1008 00:50:01,820 --> 00:50:03,740 private companies, in terms of how 1009 00:50:03,740 --> 00:50:07,280 they design algorithms, how they implement them on the ground. 1010 00:50:07,280 --> 00:50:10,340 The law, I think, also has important expressive principles 1011 00:50:10,340 --> 00:50:13,580 maybe related to ethics, where if the law allows 1012 00:50:13,580 --> 00:50:16,430 for something, then citizens, members of society, 1013 00:50:16,430 --> 00:50:19,410 will view something as maybe more socially acceptable. 1014 00:50:19,410 --> 00:50:22,520 So I think the law here has a big role to play. 1015 00:50:22,520 --> 00:50:24,650 Coming back to the criminal justice system, though, 1016 00:50:24,650 --> 00:50:27,800 I think there are many ways in which the current law, 1017 00:50:27,800 --> 00:50:29,930 certainly in the United States, falls 1018 00:50:29,930 --> 00:50:33,290 short for a lot of the new challenges that 1019 00:50:33,290 --> 00:50:35,180 might come with algorithms. 1020 00:50:35,180 --> 00:50:37,820 So to give you some examples, many people 1021 00:50:37,820 --> 00:50:40,940 are troubled by the use of disparities 1022 00:50:40,940 --> 00:50:44,480 that can emerge when you use an algorithm to make decisions. 1023 00:50:44,480 --> 00:50:47,210 And that could be because of the data or the structure 1024 00:50:47,210 --> 00:50:48,690 of the algorithm. 1025 00:50:48,690 --> 00:50:50,300 Now, it turns out there's probably 1026 00:50:50,300 --> 00:50:53,300 pretty limited legal remedies for addressing 1027 00:50:53,300 --> 00:50:54,950 those disparities. 1028 00:50:54,950 --> 00:50:58,610 Under current US law, a finding of discrimination 1029 00:50:58,610 --> 00:51:02,030 under the Equal Protection Clause of the US Constitution 1030 00:51:02,030 --> 00:51:05,390 would require a showing of discriminatory intent 1031 00:51:05,390 --> 00:51:06,620 or purpose. 1032 00:51:06,620 --> 00:51:10,100 And that's hard, because when an algorithmic designer chooses 1033 00:51:10,100 --> 00:51:12,870 to use a variable or certain types of data, 1034 00:51:12,870 --> 00:51:15,590 there's probably often no discriminatory intent 1035 00:51:15,590 --> 00:51:16,550 or purpose. 
1036 00:51:16,550 --> 00:51:18,255 And yet because so many variables 1037 00:51:18,255 --> 00:51:21,440 can be proxies for things we're troubled by, 1038 00:51:21,440 --> 00:51:24,390 there's often maybe no direct legal remedy. 1039 00:51:24,390 --> 00:51:26,120 And so this traditional requirement 1040 00:51:26,120 --> 00:51:28,670 we've had in the US Constitution and case 1041 00:51:28,670 --> 00:51:31,540 law of requiring intent and motive 1042 00:51:31,540 --> 00:51:33,890 is often ill suited to addressing 1043 00:51:33,890 --> 00:51:37,490 the new types of problems that the algorithms can introduce. 1044 00:51:37,490 --> 00:51:39,890 Moreover, it's actually been the case in the US 1045 00:51:39,890 --> 00:51:43,880 that many have interpreted the case law on discrimination 1046 00:51:43,880 --> 00:51:47,450 as requiring or prohibiting the use of characteristics 1047 00:51:47,450 --> 00:51:48,780 like race or ethnicity. 1048 00:51:48,780 --> 00:51:52,080 You can not use them in any way, shape, or form. 1049 00:51:52,080 --> 00:51:55,610 But the reality is that because of the complex statistical 1050 00:51:55,610 --> 00:52:00,230 relationships underlying many variables, I, other computer 1051 00:52:00,230 --> 00:52:02,870 scientists, economists have written and shown 1052 00:52:02,870 --> 00:52:06,080 that those proxy effects that we may be worried about 1053 00:52:06,080 --> 00:52:09,890 are often created because of the prohibition on the use 1054 00:52:09,890 --> 00:52:11,400 of those characteristics. 1055 00:52:11,400 --> 00:52:14,180 And that once you take statistics into account, 1056 00:52:14,180 --> 00:52:17,330 you may actually want to use protected characteristics 1057 00:52:17,330 --> 00:52:21,330 in certain forms to actually remedy those disparities. 1058 00:52:21,330 --> 00:52:23,610 And so it's actually this problem right now, 1059 00:52:23,610 --> 00:52:26,930 where I think the law is pushing companies, governments 1060 00:52:26,930 --> 00:52:29,960 to develop versions of algorithms that may actually 1061 00:52:29,960 --> 00:52:33,200 be counterproductive to our larger societal goal 1062 00:52:33,200 --> 00:52:35,690 of equality and opportunity. 1063 00:52:35,690 --> 00:52:38,780 And I think, to the earlier point about human decision 1064 00:52:38,780 --> 00:52:42,320 making, the law often does not consider counterfactuals 1065 00:52:42,320 --> 00:52:43,850 in a very easy way. 1066 00:52:43,850 --> 00:52:47,510 It often seems to require perfection for algorithms, 1067 00:52:47,510 --> 00:52:48,560 explainability. 1068 00:52:48,560 --> 00:52:51,380 But as you point out, what is more a black box 1069 00:52:51,380 --> 00:52:53,390 than what is in a judge's mind? 1070 00:52:53,390 --> 00:52:56,300 Perhaps the judge's mind is more of a black box 1071 00:52:56,300 --> 00:52:59,830 than a neural network or other forms of machine learning. 1072 00:52:59,830 --> 00:53:03,710 And so I worry that the law, by sometimes requiring perfection 1073 00:53:03,710 --> 00:53:06,170 and not considering the counterfactual, 1074 00:53:06,170 --> 00:53:11,060 will often chill and deter what may be innovative and good uses 1075 00:53:11,060 --> 00:53:14,310 of algorithmic decision making. 1076 00:53:14,310 --> 00:53:16,250 URS GASSER: So also the relationship 1077 00:53:16,250 --> 00:53:19,910 between technology and law and law and ethics 1078 00:53:19,910 --> 00:53:22,540 is very complicated and bi-directional, 1079 00:53:22,540 --> 00:53:25,340 with unintended consequences included. 
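Crystal Yang's point that blindness to protected characteristics can backfire statistically can be illustrated with a toy simulation; the data-generating process, coefficients, and variable names below are invented, not taken from her research. When a recorded input is inflated for one group, a model denied the group label passes that inflation straight into its predictions, while a model that can see the label can net it out.

```python
# Toy simulation of the proxy problem: all numbers and names are invented.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 20000
group = rng.integers(0, 2, n)            # hypothetical protected attribute (0/1)
true_risk = rng.normal(0.0, 1.0, n)      # latent risk, identical across groups

# The observed record (e.g. prior contacts) tracks true risk but is inflated
# for group 1, standing in for differential enforcement rather than behaviour.
record = true_risk + 1.0 * group + rng.normal(0.0, 0.5, n)

# "Blind" model sees only the record; "aware" model sees record plus group label.
blind = LinearRegression().fit(record.reshape(-1, 1), true_risk)
aware = LinearRegression().fit(np.column_stack([record, group]), true_risk)

pred_blind = blind.predict(record.reshape(-1, 1))
pred_aware = aware.predict(np.column_stack([record, group]))

for name, pred in [("blind", pred_blind), ("aware", pred_aware)]:
    gap = pred[group == 1].mean() - pred[group == 0].mean()
    print(f"{name} model: predicted risk gap (group 1 - group 0) = {gap:+.2f}")
# The true gap is ~0 by construction; the blind model reports a large spurious
# gap (~+0.67) because the inflated record proxies for group membership, while
# the aware model uses the label to correct for it (gap ~0.00).
```

This mirrors the argument in the transcript that forbidding any use of protected characteristics can itself create the proxy effects the prohibition is meant to prevent.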
1080 00:53:25,340 --> 00:53:27,160 FRANK-WALTER STEINMEIER: [SPEAKING GERMAN] 1081 00:53:27,160 --> 00:53:32,100 INTERPRETER: Still, if I may, some of you 1082 00:53:32,100 --> 00:53:38,940 have put the two on a par in a way that does not 1083 00:53:38,940 --> 00:53:43,160 convince me that any decision taken by an algorithm 1084 00:53:43,160 --> 00:53:44,450 is plausible. 1085 00:53:44,450 --> 00:53:46,830 You know, the fact that we don't always 1086 00:53:46,830 --> 00:53:52,080 understand algorithms and software that 1087 00:53:52,080 --> 00:53:58,040 is guided by algorithms, that leaves me very concerned. 1088 00:53:58,040 --> 00:54:00,030 Let's say, and the situation in America 1089 00:54:00,030 --> 00:54:03,130 is slightly different from the situation in Germany, 1090 00:54:03,130 --> 00:54:06,960 a judge passes a sentence or a judgment: 1091 00:54:06,960 --> 00:54:10,140 he or she has to justify that decision. 1092 00:54:10,140 --> 00:54:13,600 Not every ruling or decision has to be accepted. 1093 00:54:13,600 --> 00:54:15,240 People may have a different opinion. 1094 00:54:15,240 --> 00:54:17,950 But as a rule, as far as the tradition in Germany 1095 00:54:17,950 --> 00:54:21,330 is concerned, you do have a very extensive duty 1096 00:54:21,330 --> 00:54:24,690 to justify your sentence or your ruling. 1097 00:54:24,690 --> 00:54:29,280 And that is what is lacking when you talk about algorithms. 1098 00:54:29,280 --> 00:54:31,650 That is one of the questions that we need to discuss, 1099 00:54:31,650 --> 00:54:33,570 I believe. 1100 00:54:33,570 --> 00:54:39,290 Is it conceivable at all that algorithms, 1101 00:54:39,290 --> 00:54:41,730 that the control of our algorithms, 1102 00:54:41,730 --> 00:54:44,280 can become or be made more transparent? 1103 00:54:44,280 --> 00:54:48,310 Of course, not towards each and every individual person, 1104 00:54:48,310 --> 00:54:51,720 but perhaps with regard to those who 1105 00:54:51,720 --> 00:54:53,790 act as the representatives 1106 00:54:53,790 --> 00:54:54,600 of the government. 1107 00:54:54,600 --> 00:54:58,440 You know, the body in question responsible for protecting 1108 00:54:58,440 --> 00:55:03,300 the rights and the freedoms of the individual. 1109 00:55:03,300 --> 00:55:04,290 Can you hear me? 1110 00:55:04,290 --> 00:55:07,110 1111 00:55:07,110 --> 00:55:09,390 And a second remark that I'd like 1112 00:55:09,390 --> 00:55:11,190 to make against the backdrop of what 1113 00:55:11,190 --> 00:55:15,990 has just been said, it is good that we have a debate 1114 00:55:15,990 --> 00:55:19,290 up here on the rostrum, so to speak, 1115 00:55:19,290 --> 00:55:24,580 about the ethical principles for digital transformation. 1116 00:55:24,580 --> 00:55:26,400 But what struck me, and I've tried 1117 00:55:26,400 --> 00:55:29,850 to refer to that in my introductory remarks: 1118 00:55:29,850 --> 00:55:35,940 when I bring together a group of experts in my office, 1119 00:55:35,940 --> 00:55:39,990 I have experts briefing me about the technological potential 1120 00:55:39,990 --> 00:55:42,415 of AI. 1121 00:55:42,415 --> 00:55:45,030 And, you know, I have an idea after these talks of what 1122 00:55:45,030 --> 00:55:46,760 is doable, what is conceivable.
1123 00:55:46,760 --> 00:55:50,580 But when I have talks about the ethical limits of digitization, 1124 00:55:50,580 --> 00:55:52,980 it brings together a wholly different group of people, 1125 00:55:52,980 --> 00:55:57,610 because as a rule, I do not meet IT experts or engineers, 1126 00:55:57,610 --> 00:55:59,910 but I meet social scientists, philosophers, 1127 00:55:59,910 --> 00:56:05,850 political scientists, which is an indication, to some extent, 1128 00:56:05,850 --> 00:56:08,220 of something that keeps me deeply troubled. 1129 00:56:08,220 --> 00:56:10,373 And that is that we have a debate, but 1130 00:56:10,373 --> 00:56:14,200 that that debate takes place within closed circles, 1131 00:56:14,200 --> 00:56:15,090 closed communities. 1132 00:56:15,090 --> 00:56:19,440 That is to say, we have a debate about the ethical limits 1133 00:56:19,440 --> 00:56:24,330 of a digital age, about the limits that we 1134 00:56:24,330 --> 00:56:26,820 have to bear in mind and that we should 1135 00:56:26,820 --> 00:56:29,760 not surpass or overstep. 1136 00:56:29,760 --> 00:56:33,540 We have a similar debate about the functioning of democracy, 1137 00:56:33,540 --> 00:56:38,700 but it is not carried beyond the respective communities. 1138 00:56:38,700 --> 00:56:39,870 Please tell me if I'm wrong. 1139 00:56:39,870 --> 00:56:41,940 I'm happy to hear you point that out to me. 1140 00:56:41,940 --> 00:56:48,030 But as I see it, and as I wish to see it, 1141 00:56:48,030 --> 00:56:50,080 the two communities that I've been mentioning, 1142 00:56:50,080 --> 00:56:52,020 the tech community on the one hand 1143 00:56:52,020 --> 00:56:56,770 and the more philosophical community 1144 00:56:56,770 --> 00:56:59,100 bringing together social scientists and philosophers, 1145 00:56:59,100 --> 00:57:02,120 we don't have a discussion that brings both groups together. 1146 00:57:02,120 --> 00:57:04,120 We haven't been able to link up that discussion. 1147 00:57:04,120 --> 00:57:06,007 Is that impression that I have correct? 1148 00:57:06,007 --> 00:57:07,090 Would you agree with that? 1149 00:57:07,090 --> 00:57:09,630 Is it limited to Germany, or would you 1150 00:57:09,630 --> 00:57:12,150 say that this is also transferable to the debate 1151 00:57:12,150 --> 00:57:13,560 in the United States? 1152 00:57:13,560 --> 00:57:15,550 URS GASSER: You're trying to change that, right, over at MIT? 1153 00:57:15,550 --> 00:57:17,130 Can you share your thoughts? 1154 00:57:17,130 --> 00:57:20,680 And then I would like to open up for a number of questions. 1155 00:57:20,680 --> 00:57:22,220 So be ready with your questions. 1156 00:57:22,220 --> 00:57:22,590 MELISSA NOBLES: Sure. 1157 00:57:22,590 --> 00:57:24,450 So the Schwarzman College of Computing, 1158 00:57:24,450 --> 00:57:26,140 which was announced last year, is 1159 00:57:26,140 --> 00:57:28,170 intended to get at just this issue, 1160 00:57:28,170 --> 00:57:31,430 that with much of the technology that is obviously being created 1161 00:57:31,430 --> 00:57:34,800 at MIT, we recognize that there has 1162 00:57:34,800 --> 00:57:38,255 to be a bridge between technology and the humanities 1163 00:57:38,255 --> 00:57:42,510 and the social sciences in an intentional, deliberate way. 1164 00:57:42,510 --> 00:57:44,640 And part of why the school was established 1165 00:57:44,640 --> 00:57:46,822 is, tons of students are interested in computing. 1166 00:57:46,822 --> 00:57:47,530 They're doing it.
1167 00:57:47,530 --> 00:57:49,920 They're coming in, wanting to major in computer science, 1168 00:57:49,920 --> 00:57:52,260 but many of them don't want to only be computer scientists. 1169 00:57:52,260 --> 00:57:54,427 They want to apply that knowledge to something else, 1170 00:57:54,427 --> 00:57:56,820 but they want it to be guided by some domain knowledge 1171 00:57:56,820 --> 00:57:59,410 outside of computer science. 1172 00:57:59,410 --> 00:58:03,120 And so the goal of the college is to eventually-- we're 1173 00:58:03,120 --> 00:58:06,450 beginning to see joint, blended degrees between computer 1174 00:58:06,450 --> 00:58:09,600 science and economics, computer science and urban studies, 1175 00:58:09,600 --> 00:58:11,940 computer science and music. 1176 00:58:11,940 --> 00:58:15,010 Not all students are doing this, but the interest is great, 1177 00:58:15,010 --> 00:58:18,420 and it's intended to allow for this connection 1178 00:58:18,420 --> 00:58:21,040 in a more organic way from the beginning, 1179 00:58:21,040 --> 00:58:23,617 such that students will have the type of skills, 1180 00:58:23,617 --> 00:58:25,950 so that we won't be talking about disparate communities, 1181 00:58:25,950 --> 00:58:27,570 but that students will have enough 1182 00:58:27,570 --> 00:58:30,780 of an openness, at least an exposure understanding 1183 00:58:30,780 --> 00:58:33,855 that this is what it means to be a computer scientist. 1184 00:58:33,855 --> 00:58:36,280 And conversely, for my own discipline, 1185 00:58:36,280 --> 00:58:38,220 I'm a political scientist, this is 1186 00:58:38,220 --> 00:58:40,160 what it means to be a political scientist, 1187 00:58:40,160 --> 00:58:41,650 is to know something about this. 1188 00:58:41,650 --> 00:58:44,550 So all of us are going to have to learn more and be 1189 00:58:44,550 --> 00:58:48,780 open to learning more, if we're going to successfully deal 1190 00:58:48,780 --> 00:58:50,053 with this issue. 1191 00:58:50,053 --> 00:58:51,720 One other thing I'd like to say about it 1192 00:58:51,720 --> 00:58:55,650 is, we're starting to have these conversations on campus. 1193 00:58:55,650 --> 00:58:58,830 They are not easy conversations to have. 1194 00:58:58,830 --> 00:59:01,560 As much as we try to be collaborative, 1195 00:59:01,560 --> 00:59:04,470 we've really had to work. 1196 00:59:04,470 --> 00:59:06,960 We may be using the same terms, but we 1197 00:59:06,960 --> 00:59:08,820 speak a different language. 1198 00:59:08,820 --> 00:59:11,850 And it requires patience to do this. 1199 00:59:11,850 --> 00:59:14,280 So part of what we're doing is also 1200 00:59:14,280 --> 00:59:17,460 learning some other principles of generosity and patience 1201 00:59:17,460 --> 00:59:18,910 as we deal with one another. 1202 00:59:18,910 --> 00:59:21,320 Because if we want to solve this problem, 1203 00:59:21,320 --> 00:59:24,235 deal with technology in a way that we all want to see, 1204 00:59:24,235 --> 00:59:25,860 then that's what it's going to require, 1205 00:59:25,860 --> 00:59:27,860 some other human qualities that we have to bring 1206 00:59:27,860 --> 00:59:29,710 to bear for this to happen. 1207 00:59:29,710 --> 00:59:32,490 And I think especially just inclusion 1208 00:59:32,490 --> 00:59:35,280 is really important for us for our undergraduates, 1209 00:59:35,280 --> 00:59:39,310 since many of them will be going into leadership positions. 1210 00:59:39,310 --> 00:59:40,750 They know the technologies. 
1211 00:59:40,750 --> 00:59:42,750 We need them working on the congressional staff. 1212 00:59:42,750 --> 00:59:46,770 If you all saw the hearings with Mark Zuckerberg 1213 00:59:46,770 --> 00:59:50,933 in the Congress, people didn't know what Google was, right? 1214 00:59:50,933 --> 00:59:52,350 Or they didn't know the difference 1215 00:59:52,350 --> 00:59:53,465 between Samsung and Apple. 1216 00:59:53,465 --> 00:59:54,840 I mean, they didn't know anything 1217 00:59:54,840 --> 00:59:56,280 about the technologies. 1218 00:59:56,280 --> 00:59:59,562 If they don't understand how to open their phones, 1219 00:59:59,562 --> 01:00:01,020 how can you imagine that they could 1220 01:00:01,020 --> 01:00:03,780 be responsible and entrusted to do the kinds of things 1221 01:00:03,780 --> 01:00:05,820 that you all are describing? 1222 01:00:05,820 --> 01:00:09,300 Some of what we also need, our students 1223 01:00:09,300 --> 01:00:13,990 to be able to play those kind of roles, precisely because-- 1224 01:00:13,990 --> 01:00:16,880 but we don't want them to do it only knowing the technology. 1225 01:00:16,880 --> 01:00:19,000 They will also have to understand economics, also 1226 01:00:19,000 --> 01:00:21,160 have to understand political science, and such. 1227 01:00:21,160 --> 01:00:23,650 So that is the task of the college. 1228 01:00:23,650 --> 01:00:25,753 And we're just getting started, so stay tuned. 1229 01:00:25,753 --> 01:00:26,920 URS GASSER: That's exciting. 1230 01:00:26,920 --> 01:00:27,900 That's exciting. 1231 01:00:27,900 --> 01:00:28,950 Thank you. 1232 01:00:28,950 --> 01:00:31,900 OK, let's open up for a few questions. 1233 01:00:31,900 --> 01:00:35,400 [? Becca, ?] our mic-runner is ready and fast on her legs. 1234 01:00:35,400 --> 01:00:36,720 Who has a question? 1235 01:00:36,720 --> 01:00:39,240 And please end with a question mark. 1236 01:00:39,240 --> 01:00:40,492 That would be good. 1237 01:00:40,492 --> 01:00:42,947 [LAUGHTER] 1238 01:00:42,947 --> 01:00:48,840 1239 01:00:48,840 --> 01:00:50,090 AUDIENCE: Me? 1240 01:00:50,090 --> 01:00:52,860 First thank you so much for all the interesting insights 1241 01:00:52,860 --> 01:00:54,690 you shared today. 1242 01:00:54,690 --> 01:00:57,420 My question is, regarding the fact that a lot of you 1243 01:00:57,420 --> 01:00:59,790 mentioned today that there's an urgency 1244 01:00:59,790 --> 01:01:03,840 to craft tech specific ethics regulations as soon 1245 01:01:03,840 --> 01:01:05,080 as possible. 1246 01:01:05,080 --> 01:01:07,470 So in a way, this is really a moral discussion 1247 01:01:07,470 --> 01:01:08,760 with a deadline. 1248 01:01:08,760 --> 01:01:11,130 When would you say is this deadline? 1249 01:01:11,130 --> 01:01:17,340 When do we have to formalize our thoughts and put it into law? 1250 01:01:17,340 --> 01:01:20,700 EVA WEBER-GURSKA: Oh, was that a question for me? 1251 01:01:20,700 --> 01:01:22,064 URS GASSER: Now it is, yes. 1252 01:01:22,064 --> 01:01:23,920 [LAUGHTER] 1253 01:01:23,920 --> 01:01:28,740 EVA WEBER-GURSKA: No, of course I think the deadline is-- 1254 01:01:28,740 --> 01:01:32,010 it's not far ahead, but it's just right now. 1255 01:01:32,010 --> 01:01:35,580 But there are already a lot of ethic guidelines 1256 01:01:35,580 --> 01:01:36,960 being written right now. 1257 01:01:36,960 --> 01:01:40,455 So we really have on national level a different one 1258 01:01:40,455 --> 01:01:40,955 in Germany. 
1259 01:01:40,955 --> 01:01:42,720 Then there are international levels, 1260 01:01:42,720 --> 01:01:45,150 like, for example, the High-Level Expert 1261 01:01:45,150 --> 01:01:51,600 Group that was tasked by the European Commission. 1262 01:01:51,600 --> 01:01:53,160 And they wrote something, and then 1263 01:01:53,160 --> 01:01:56,880 parts of these ethics guidelines, for example, 1264 01:01:56,880 --> 01:02:01,390 have already been extracted, and the G20 group signed them. 1265 01:02:01,390 --> 01:02:04,610 And so there are already guidelines, 1266 01:02:04,610 --> 01:02:08,140 but I think the guidelines are only the first step. 1267 01:02:08,140 --> 01:02:11,400 And then, of course, there is the further step 1268 01:02:11,400 --> 01:02:14,320 of transforming them into law, as we heard already 1269 01:02:14,320 --> 01:02:15,840 with the [GERMAN]. 1270 01:02:15,840 --> 01:02:21,460 And so yeah, it's right now, and we should go further ahead. 1271 01:02:21,460 --> 01:02:24,425 But there is already something happening, I think. 1272 01:02:24,425 --> 01:02:25,300 URS GASSER: Wolfgang? 1273 01:02:25,300 --> 01:02:27,717 WOLFGANG SCHULZ: Yeah, maybe I can add a legal perspective 1274 01:02:27,717 --> 01:02:30,480 to that, because one of the problems of the law 1275 01:02:30,480 --> 01:02:34,900 is that the function of law is to have stability. 1276 01:02:34,900 --> 01:02:38,350 And what we need here is some flexibility as well. 1277 01:02:38,350 --> 01:02:40,830 And so what we are struggling with, as legal scholars, 1278 01:02:40,830 --> 01:02:44,910 is to find ways to make law more flexible, 1279 01:02:44,910 --> 01:02:47,780 to have constant evaluation, to have sunset clauses 1280 01:02:47,780 --> 01:02:51,343 and things like that, so that we do not have to wait. 1281 01:02:51,343 --> 01:02:53,760 I think it was Susan Crawford who said the marvelous sentence: 1282 01:02:53,760 --> 01:02:57,130 we have to regulate things that we don't understand. 1283 01:02:57,130 --> 01:03:01,410 And I think we can't wait until all of the lawmakers 1284 01:03:01,410 --> 01:03:04,560 have understood what it actually is. 1285 01:03:04,560 --> 01:03:05,910 You have to act before that. 1286 01:03:05,910 --> 01:03:09,000 But then we need different instruments, especially when 1287 01:03:09,000 --> 01:03:11,460 you take into account what you said, 1288 01:03:11,460 --> 01:03:14,130 that when you are talking with lawmakers, even if they really 1289 01:03:14,130 --> 01:03:17,250 try hard, the problems are really high tech. 1290 01:03:17,250 --> 01:03:19,840 Only a couple of people really understand what's happening 1291 01:03:19,840 --> 01:03:23,820 there, and even the best member of parliament cannot 1292 01:03:23,820 --> 01:03:25,480 be an expert in this field. 1293 01:03:25,480 --> 01:03:29,850 So we need constant evaluation and some mechanisms 1294 01:03:29,850 --> 01:03:32,360 to deal with that. 1295 01:03:32,360 --> 01:03:35,360 Even we as researchers: every week 1296 01:03:35,360 --> 01:03:39,290 I have a new understanding of how algorithms 1297 01:03:39,290 --> 01:03:40,650 interact with society. 1298 01:03:40,650 --> 01:03:43,220 Every meeting, I have an interaction with some software 1299 01:03:43,220 --> 01:03:45,590 engineer and think, oh, OK, it's a little bit different 1300 01:03:45,590 --> 01:03:46,670 than I thought before. 1301 01:03:46,670 --> 01:03:50,420 And on this basis to create law, that's, I would say, 1302 01:03:50,420 --> 01:03:52,955 a fundamentally new challenge.
1303 01:03:52,955 --> 01:03:54,705 FRANK-WALTER STEINMEIER: [SPEAKING GERMAN] 1304 01:03:54,705 --> 01:03:57,810 INTERPRETER: But if I may add to what you've just said, 1305 01:03:57,810 --> 01:03:59,510 the situation is changing. 1306 01:03:59,510 --> 01:04:01,950 When you take a look at the German legal culture, 1307 01:04:01,950 --> 01:04:04,430 you know, it's based on the assumption 1308 01:04:04,430 --> 01:04:06,980 that a law that is passed is there until kingdom come, 1309 01:04:06,980 --> 01:04:09,140 that it's for eternity, you know? 1310 01:04:09,140 --> 01:04:12,960 When we now look at the area of internet law, 1311 01:04:12,960 --> 01:04:16,910 something interesting is happening that is 1312 01:04:16,910 --> 01:04:18,440 quite difficult, complicated. 1313 01:04:18,440 --> 01:04:21,380 When you look at the relationship between the one 1314 01:04:21,380 --> 01:04:24,230 passing the law, making the law, and the public, 1315 01:04:24,230 --> 01:04:28,070 you know, the public comments about the Network Enforcement 1316 01:04:28,070 --> 01:04:31,220 Act in Germany, for example, are an indication of that. 1317 01:04:31,220 --> 01:04:34,190 In some areas of legislation, we have already reached a point 1318 01:04:34,190 --> 01:04:40,160 where we can no longer provide an eternal guarantee 1319 01:04:40,160 --> 01:04:42,130 for the legislation that is being passed. 1320 01:04:42,130 --> 01:04:46,130 We are taking one hesitant step after the other, 1321 01:04:46,130 --> 01:04:50,540 carefully trying to see how that intervention is 1322 01:04:50,540 --> 01:04:52,950 going to affect reality in the future. 1323 01:04:52,950 --> 01:04:56,000 This careful and cautious, tentative approach 1324 01:04:56,000 --> 01:05:00,555 in legislation that is taking place right now, 1325 01:05:00,555 --> 01:05:02,930 it's not something that is being greeted with enthusiasm. 1326 01:05:02,930 --> 01:05:05,660 And I do understand that, but there might be no alternative 1327 01:05:05,660 --> 01:05:06,950 to that approach, you know: 1328 01:05:06,950 --> 01:05:10,490 trying, time and again, to refer back 1329 01:05:10,490 --> 01:05:13,120 from the instruments to the technology and vice versa, 1330 01:05:13,120 --> 01:05:15,260 and to amend things when necessary. 1331 01:05:15,260 --> 01:05:15,475 URS GASSER: One of the dark sides 1332 01:05:15,475 --> 01:05:17,183 of having such an amazing group of people 1333 01:05:17,183 --> 01:05:21,200 is that we could discuss just one question for 20 minutes. 1334 01:05:21,200 --> 01:05:24,570 So do you want to jump in quickly with comments, Jeanette 1335 01:05:24,570 --> 01:05:26,245 and Matthew? 1336 01:05:26,245 --> 01:05:28,370 JEANETTE HOFMANN: I thought perhaps one way forward 1337 01:05:28,370 --> 01:05:33,860 could be to pursue a more procedural approach 1338 01:05:33,860 --> 01:05:35,300 to these problems. 1339 01:05:35,300 --> 01:05:39,020 For example, think of ways of holding companies 1340 01:05:39,020 --> 01:05:42,140 accountable for the kinds of technology development 1341 01:05:42,140 --> 01:05:44,250 they try to bring to the market, 1342 01:05:44,250 --> 01:05:47,780 introducing auditing requirements 1343 01:05:47,780 --> 01:05:50,810 for certain types of algorithms.
1344 01:05:50,810 --> 01:05:56,270 Make it mandatory to only use in certain areas machine 1345 01:05:56,270 --> 01:05:58,790 learning systems that are self-explainable, 1346 01:05:58,790 --> 01:06:03,680 that explain, in at least basic ways, 1347 01:06:03,680 --> 01:06:06,590 how they come to certain recommendations 1348 01:06:06,590 --> 01:06:07,790 and predictions. 1349 01:06:07,790 --> 01:06:09,950 That seems to be a way forward, rather 1350 01:06:09,950 --> 01:06:14,397 than relying just on rules. 1351 01:06:14,397 --> 01:06:16,230 MATTHEW LIAO: Maybe I can also jump in here. 1352 01:06:16,230 --> 01:06:19,730 So I think Professor Gasser's mentioned 1353 01:06:19,730 --> 01:06:25,160 that there are about 130 ethical principles that 1354 01:06:25,160 --> 01:06:28,970 are being presented by different companies 1355 01:06:28,970 --> 01:06:30,150 and so on and so forth. 1356 01:06:30,150 --> 01:06:34,500 I think what we really also need is a rationale, 1357 01:06:34,500 --> 01:06:35,870 or philosophical-- 1358 01:06:35,870 --> 01:06:38,180 this is a plug for philosophers-- 1359 01:06:38,180 --> 01:06:41,890 a philosophical justification for some of these principles. 1360 01:06:41,890 --> 01:06:44,270 So for example, they talk about-- a lot 1361 01:06:44,270 --> 01:06:47,390 of these principles say things like "we need explainability." 1362 01:06:47,390 --> 01:06:48,860 Why do we need explainability? 1363 01:06:48,860 --> 01:06:55,280 I mean, we've heard some of the panelists asking this question 1364 01:06:55,280 --> 01:06:56,760 and various other things. 1365 01:06:56,760 --> 01:06:59,690 And so here in that line, I'm very sympathetic to what 1366 01:06:59,690 --> 01:07:01,070 Professor Hofmann's saying, which 1367 01:07:01,070 --> 01:07:07,250 is this idea of the human rights framework, which says that, we 1368 01:07:07,250 --> 01:07:08,970 need to look towards the goal. 1369 01:07:08,970 --> 01:07:10,760 What are these algorithms for? 1370 01:07:10,760 --> 01:07:14,280 Fundamentally, they are about promoting human well-being, 1371 01:07:14,280 --> 01:07:14,780 right? 1372 01:07:14,780 --> 01:07:17,540 We want to make sure that we have a harmonious society, one 1373 01:07:17,540 --> 01:07:19,950 that works towards all of us. 1374 01:07:19,950 --> 01:07:22,130 And so a human rights framework, I think, 1375 01:07:22,130 --> 01:07:26,322 can really move towards that goal. 1376 01:07:26,322 --> 01:07:27,530 And there's a rich tradition. 1377 01:07:27,530 --> 01:07:29,270 There's a rich literature on these 1378 01:07:29,270 --> 01:07:32,720 philosophical justifications of the different rights. 1379 01:07:32,720 --> 01:07:35,972 And they go beyond just discrimination, right? 1380 01:07:35,972 --> 01:07:37,430 They say-- they're positive rights. 1381 01:07:37,430 --> 01:07:41,540 They're rights where we just-- it's not just 1382 01:07:41,540 --> 01:07:44,180 about making sure that you don't discriminate, by making sure 1383 01:07:44,180 --> 01:07:47,420 that your technologies also work to help people 1384 01:07:47,420 --> 01:07:48,930 and so on and so forth. 1385 01:07:48,930 --> 01:07:50,600 And the other thing about human rights 1386 01:07:50,600 --> 01:07:54,720 is that it's an obligation on everybody. 1387 01:07:54,720 --> 01:07:57,380 So it's not just an obligation on the engineers. 1388 01:07:57,380 --> 01:07:59,880 It's not just an obligation on the company. 1389 01:07:59,880 --> 01:08:02,430 It's not just an obligation on the government. 
1390 01:08:02,430 --> 01:08:04,170 It's an obligation on all of us. 1391 01:08:04,170 --> 01:08:05,720 We need to collectively make sure 1392 01:08:05,720 --> 01:08:09,680 that we're working towards making sure 1393 01:08:09,680 --> 01:08:13,947 that these technologies work well for everybody. 1394 01:08:13,947 --> 01:08:14,780 URS GASSER: Crystal. 1395 01:08:14,780 --> 01:08:16,040 CRYSTAL YANG: I promise to be very brief. 1396 01:08:16,040 --> 01:08:17,770 I know there are a lot of hands up. 1397 01:08:17,770 --> 01:08:19,359 This is such a fascinating question. 1398 01:08:19,359 --> 01:08:22,077 I think, I wholeheartedly endorse the perspectives others 1399 01:08:22,077 --> 01:08:23,660 have raised, especially Mr. President, 1400 01:08:23,660 --> 01:08:25,520 that the laws must be adaptive. 1401 01:08:25,520 --> 01:08:28,640 They must be flexible, because we are still learning how they 1402 01:08:28,640 --> 01:08:30,939 work, how the algorithms work. 1403 01:08:30,939 --> 01:08:34,090 And so we have to also study when a new regulation goes 1404 01:08:34,090 --> 01:08:35,062 into effect. 1405 01:08:35,062 --> 01:08:36,520 What does that mean about the types 1406 01:08:36,520 --> 01:08:39,240 of algorithms we're seeing that flourish after that? 1407 01:08:39,240 --> 01:08:42,670 What types of algorithms are now disappearing as a result? 1408 01:08:42,670 --> 01:08:44,270 To that point about explainability, 1409 01:08:44,270 --> 01:08:48,069 I think we all have this desire to understand 1410 01:08:48,069 --> 01:08:49,939 what the algorithm is doing. 1411 01:08:49,939 --> 01:08:51,700 And so we often, then, might shift 1412 01:08:51,700 --> 01:08:55,069 towards regulation or principles that the algorithm 1413 01:08:55,069 --> 01:08:56,380 be explainable. 1414 01:08:56,380 --> 01:08:58,090 The complication there is, now there's 1415 01:08:58,090 --> 01:09:01,330 emerging work coming from computer science and economics, 1416 01:09:01,330 --> 01:09:04,540 showing that when you force an algorithm to be explainable, 1417 01:09:04,540 --> 01:09:06,850 you generally will choose an algorithm that's 1418 01:09:06,850 --> 01:09:10,220 simpler, because it has to be easier to understand. 1419 01:09:10,220 --> 01:09:13,450 But a simpler algorithm, as it turns out in some contexts, 1420 01:09:13,450 --> 01:09:16,569 actually can lead to both less efficient results 1421 01:09:16,569 --> 01:09:20,140 and less equitable results, which again raises a conundrum 1422 01:09:20,140 --> 01:09:22,689 that I think no field alone can address, 1423 01:09:22,689 --> 01:09:25,899 but just reveals that there are inherent trade-offs 1424 01:09:25,899 --> 01:09:28,899 every time we make a choice, like explainability. 1425 01:09:28,899 --> 01:09:30,729 And we have to confront those trade-offs 1426 01:09:30,729 --> 01:09:33,880 and decide how do we weigh competing values, which 1427 01:09:33,880 --> 01:09:36,690 are inevitably going to be at stake. 1428 01:09:36,690 --> 01:09:37,640 URS GASSER: Great. 1429 01:09:37,640 --> 01:09:40,151 So let's collect three questions, 1430 01:09:40,151 --> 01:09:41,109 and then we'll respond. 1431 01:09:41,109 --> 01:09:42,819 One question here, and then maybe one 1432 01:09:42,819 --> 01:09:44,430 from over here, this area. 1433 01:09:44,430 --> 01:09:47,529 AUDIENCE: Yeah, I'd like to add an observation. 
1434 01:09:47,529 --> 01:09:51,010 I think it's also observable on the podium 1435 01:09:51,010 --> 01:09:54,069 that we are missing economists, and we are 1436 01:09:54,069 --> 01:09:55,780 missing behavioral scientists. 1437 01:09:55,780 --> 01:09:58,120 And it seems to me that these two components are 1438 01:09:58,120 --> 01:10:02,710 crucial in understanding the impact that AI has had 1439 01:10:02,710 --> 01:10:05,680 and will have on our society and each of us. 1440 01:10:05,680 --> 01:10:06,740 Why do I say this? 1441 01:10:06,740 --> 01:10:10,540 Because AI has enormous economic potency. 1442 01:10:10,540 --> 01:10:13,900 In this country, the majority of the productivity 1443 01:10:13,900 --> 01:10:16,060 of this country comes from AI. 1444 01:10:16,060 --> 01:10:19,390 And why is it that Facebook and Google and other companies 1445 01:10:19,390 --> 01:10:23,080 have been doing, undaunted, what they have been doing? 1446 01:10:23,080 --> 01:10:25,370 It is exactly because of that. 1447 01:10:25,370 --> 01:10:28,160 And so that is one reality that we have to face. 1448 01:10:28,160 --> 01:10:32,560 And this reality is deeply immersed in research as well. 1449 01:10:32,560 --> 01:10:34,840 Where is most of our funding going? 1450 01:10:34,840 --> 01:10:38,770 It is going to computer science, to computer engineering. 1451 01:10:38,770 --> 01:10:41,050 And then we have some alibi addition, excuse 1452 01:10:41,050 --> 01:10:43,810 the term, of social sciences, 1453 01:10:43,810 --> 01:10:46,360 and if we are lucky, behavioral sciences. 1454 01:10:46,360 --> 01:10:47,000 There's no-- 1455 01:10:47,000 --> 01:10:47,150 URS GASSER: Thank you. 1456 01:10:47,150 --> 01:10:48,370 Sorry, we have to stop there. 1457 01:10:48,370 --> 01:10:49,995 AUDIENCE: It's just-- it's very-- well, 1458 01:10:49,995 --> 01:10:51,910 we've heard a lot from the panel. 1459 01:10:51,910 --> 01:10:52,680 [LAUGHTER] 1460 01:10:52,680 --> 01:10:53,680 URS GASSER: Fair enough. 1461 01:10:53,680 --> 01:10:57,010 AUDIENCE: I'm sorry, but there is no level playing 1462 01:10:57,010 --> 01:10:59,770 field between the behavioral sciences, 1463 01:10:59,770 --> 01:11:02,380 with all the psychological dynamics that 1464 01:11:02,380 --> 01:11:05,950 are opened up by AI, and computer science 1465 01:11:05,950 --> 01:11:07,420 and computer engineering. 1466 01:11:07,420 --> 01:11:11,170 Unless we change these funding structures-- and we've 1467 01:11:11,170 --> 01:11:14,290 heard a lot about the necessity for other regulations 1468 01:11:14,290 --> 01:11:16,570 for companies, but these funding structures 1469 01:11:16,570 --> 01:11:18,880 have enormous consequences. 1470 01:11:18,880 --> 01:11:20,860 President Steinmeier was asking, is there 1471 01:11:20,860 --> 01:11:24,460 any example of bringing disciplines together 1472 01:11:24,460 --> 01:11:27,580 at the beginning of such an enterprise? 1473 01:11:27,580 --> 01:11:28,720 I would say, yes. 1474 01:11:28,720 --> 01:11:31,240 And actually at the Ruhr-Universität Bochum, 1475 01:11:31,240 --> 01:11:33,550 there is one competence cluster that 1476 01:11:33,550 --> 01:11:36,520 focuses on cybersecurity, which tries 1477 01:11:36,520 --> 01:11:40,000 to give level-headed, equal importance 1478 01:11:40,000 --> 01:11:43,240 to social and behavioral science on the one hand, and economics, 1479 01:11:43,240 --> 01:11:46,390 computer science, and computer engineering on the other.
1480 01:11:46,390 --> 01:11:48,250 So I just hope that in the future, 1481 01:11:48,250 --> 01:11:52,780 such discussions move beyond very important contributions 1482 01:11:52,780 --> 01:11:56,410 from philosophers, ethicists, and lawyers 1483 01:11:56,410 --> 01:12:00,755 to also take a broader view. 1484 01:12:00,755 --> 01:12:03,210 [APPLAUSE] 1485 01:12:03,210 --> 01:12:05,233 1486 01:12:05,233 --> 01:12:06,650 URS GASSER: Not really a question. 1487 01:12:06,650 --> 01:12:08,060 AUDIENCE: I promise to be short. 1488 01:12:08,060 --> 01:12:10,940 URS GASSER: Back up. 1489 01:12:10,940 --> 01:12:12,600 We collect there, please. 1490 01:12:12,600 --> 01:12:13,310 Go ahead, please. 1491 01:12:13,310 --> 01:12:14,685 AUDIENCE: I'll keep myself short. 1492 01:12:14,685 --> 01:12:15,810 URS GASSER: Yup. 1493 01:12:15,810 --> 01:12:17,270 AUDIENCE: As we are at a German-American conference 1494 01:12:17,270 --> 01:12:19,820 right now, I just want to ask, how does the transatlantic 1495 01:12:19,820 --> 01:12:23,580 relationship help us in solving all these challenges? 1496 01:12:23,580 --> 01:12:26,280 What is needed for an effective transatlantic relationship, 1497 01:12:26,280 --> 01:12:28,340 especially the German-American one, 1498 01:12:28,340 --> 01:12:30,500 to solve all these challenges together, 1499 01:12:30,500 --> 01:12:34,850 as a world and not as separate states? 1500 01:12:34,850 --> 01:12:35,780 URS GASSER: OK. 1501 01:12:35,780 --> 01:12:36,920 Maybe one or two more? 1502 01:12:36,920 --> 01:12:37,580 Yes, please. 1503 01:12:37,580 --> 01:12:41,150 1504 01:12:41,150 --> 01:12:42,352 AUDIENCE: I have a question. 1505 01:12:42,352 --> 01:12:44,040 [LAUGHTER] 1506 01:12:44,040 --> 01:12:45,470 URS GASSER: I appreciate that. 1507 01:12:45,470 --> 01:12:49,660 AUDIENCE: Do you think, talking about social media, 1508 01:12:49,660 --> 01:12:51,140 that the time will come when there 1509 01:12:51,140 --> 01:12:54,980 will be a reliable algorithm to identify hate speech? 1510 01:12:54,980 --> 01:12:58,580 1511 01:12:58,580 --> 01:13:00,317 URS GASSER: OK, over there. 1512 01:13:00,317 --> 01:13:00,900 Last question. 1513 01:13:00,900 --> 01:13:03,120 AUDIENCE: I promised a short question. 1514 01:13:03,120 --> 01:13:05,540 AUDIENCE: I'm one of those computer scientists writing 1515 01:13:05,540 --> 01:13:08,360 those messy algorithms, and I know 1516 01:13:08,360 --> 01:13:09,900 that there are people in my field 1517 01:13:09,900 --> 01:13:11,870 who think very critically about this. 1518 01:13:11,870 --> 01:13:13,560 And I know there's a lot of discussion. 1519 01:13:13,560 --> 01:13:16,040 So I can assure you there are people 1520 01:13:16,040 --> 01:13:18,690 behind the curtain talking about these things. 1521 01:13:18,690 --> 01:13:21,650 How can we reach out to other people 1522 01:13:21,650 --> 01:13:23,090 who are thinking about this? 1523 01:13:23,090 --> 01:13:25,650 1524 01:13:25,650 --> 01:13:27,810 AUDIENCE: So my question is short. 1525 01:13:27,810 --> 01:13:29,080 Thank you all. 1526 01:13:29,080 --> 01:13:31,900 This was delightful. 1527 01:13:31,900 --> 01:13:35,880 I would like to understand, are we, the human race, being empowered 1528 01:13:35,880 --> 01:13:36,840 by technology? 1529 01:13:36,840 --> 01:13:39,660 Or are we powering technology with humans? 1530 01:13:39,660 --> 01:13:43,290 1531 01:13:43,290 --> 01:13:44,040 URS GASSER: Great. 1532 01:13:44,040 --> 01:13:45,090 All right. 1533 01:13:45,090 --> 01:13:48,210 We have 12 minutes left.
1534 01:13:48,210 --> 01:13:49,620 And I'm Swiss, as I said. 1535 01:13:49,620 --> 01:13:51,960 So I want to end on time. 1536 01:13:51,960 --> 01:13:56,290 So what I would suggest is that we actually do a closing round 1537 01:13:56,290 --> 01:14:00,510 and pick the question that you would like to address, 1538 01:14:00,510 --> 01:14:03,360 but put it into the context also of your work 1539 01:14:03,360 --> 01:14:05,470 and what we've discussed here. 1540 01:14:05,470 --> 01:14:08,490 So we have the question of transatlantic relationships. 1541 01:14:08,490 --> 01:14:13,016 We have the question around social media 1542 01:14:13,016 --> 01:14:15,710 and the role of technology in creating 1543 01:14:15,710 --> 01:14:19,890 a safer environment, using the example of hate speech. 1544 01:14:19,890 --> 01:14:22,440 And we have this ultimate question, 1545 01:14:22,440 --> 01:14:25,440 is technology empowering people, or are people here 1546 01:14:25,440 --> 01:14:27,670 somehow to empower technology? 1547 01:14:27,670 --> 01:14:31,080 So these are a few of the themes. 1548 01:14:31,080 --> 01:14:33,430 Perhaps we start with Eva. 1549 01:14:33,430 --> 01:14:35,430 EVA WEBER-GURSKA: Mm-hm. 1550 01:14:35,430 --> 01:14:39,060 Yeah, maybe to the last two questions. 1551 01:14:39,060 --> 01:14:41,790 Of course, I think that technology should empower 1552 01:14:41,790 --> 01:14:45,383 people, but for that human-machine interaction, 1553 01:14:45,383 --> 01:14:47,300 it is important that we understand each other, 1554 01:14:47,300 --> 01:14:48,425 as we already talked about. 1555 01:14:48,425 --> 01:14:54,000 And maybe just two aspects that philosophy can contribute 1556 01:14:54,000 --> 01:14:59,040 here. The one is, it's not only about the explainability 1557 01:14:59,040 --> 01:15:01,020 that you mentioned, and it's not only 1558 01:15:01,020 --> 01:15:04,780 about the ethical or moral justification of why it's 1559 01:15:04,780 --> 01:15:08,760 so important to explain, but also to see that morality, 1560 01:15:08,760 --> 01:15:10,750 for example, is about reason giving. 1561 01:15:10,750 --> 01:15:16,530 So the whole validity of moral norms 1562 01:15:16,530 --> 01:15:18,990 depends, I think, on the fact that they 1563 01:15:18,990 --> 01:15:21,450 exist between beings that can give 1564 01:15:21,450 --> 01:15:23,430 reasons and understand reasons. 1565 01:15:23,430 --> 01:15:28,095 And so one question would be, do we want algorithmic structures 1566 01:15:28,095 --> 01:15:32,250 that cannot give reasons in an empathic sense, for example. 1567 01:15:32,250 --> 01:15:35,700 And another topic would be the topic of trust, 1568 01:15:35,700 --> 01:15:39,150 because the ethics guidelines often highlight 1569 01:15:39,150 --> 01:15:41,870 trustworthy AI as a claim. 1570 01:15:41,870 --> 01:15:46,080 I would also be skeptical of this as the best aim, 1571 01:15:46,080 --> 01:15:48,960 because trustworthiness presupposes 1572 01:15:48,960 --> 01:15:53,730 also being a moral subject, because trust 1573 01:15:53,730 --> 01:15:57,900 means to believe that someone will hold to his or her 1574 01:15:57,900 --> 01:15:59,500 commitment to do something. 1575 01:15:59,500 --> 01:16:01,360 And this is also something that is only 1576 01:16:01,360 --> 01:16:03,040 possible for a moral subject. 1577 01:16:03,040 --> 01:16:07,340 So AI systems cannot be trustworthy agents or subjects. 1578 01:16:07,340 --> 01:16:08,257 URS GASSER: Thank you. 1579 01:16:08,257 --> 01:16:09,450 Matthew?
1580 01:16:09,450 --> 01:16:12,270 MATTHEW LIAO: Yes, I'll take the question on hate speech. 1581 01:16:12,270 --> 01:16:17,430 So I mean, there are some attempts using machine learning 1582 01:16:17,430 --> 01:16:22,170 algorithms to detect things like fake news and things like that, 1583 01:16:22,170 --> 01:16:25,410 but I actually want to give you a very grim picture. 1584 01:16:25,410 --> 01:16:28,710 This is like election 2.0, since we're coming up 1585 01:16:28,710 --> 01:16:30,580 to another election cycle. 1586 01:16:30,580 --> 01:16:31,740 So there's some evidence. 1587 01:16:31,740 --> 01:16:33,720 There's something called deepfakes, which 1588 01:16:33,720 --> 01:16:39,510 makes it possible to produce all these videos where they 1589 01:16:39,510 --> 01:16:42,660 can superimpose your photo, 1590 01:16:42,660 --> 01:16:44,860 or you, onto another video. 1591 01:16:44,860 --> 01:16:48,930 And then they can get you to talk and do various things. 1592 01:16:48,930 --> 01:16:51,450 And what people are finding is that-- 1593 01:16:51,450 --> 01:16:54,240 so there's the theory that you tend to vote 1594 01:16:54,240 --> 01:16:56,895 for people who look like you. 1595 01:16:56,895 --> 01:17:01,860 OK, and so now they're creating deepfake videos, where 1596 01:17:01,860 --> 01:17:07,170 they superimpose your picture onto a candidate's picture, 1597 01:17:07,170 --> 01:17:08,160 so it looks like you. 1598 01:17:08,160 --> 01:17:11,250 And now supposedly that's going to influence your voting 1599 01:17:11,250 --> 01:17:13,320 behavior, because you're more likely to vote 1600 01:17:13,320 --> 01:17:14,850 for people who look like you. 1601 01:17:14,850 --> 01:17:19,860 And so that's going to be very worrying in the future. 1602 01:17:19,860 --> 01:17:23,040 And then the question is, we're going to get to a point 1603 01:17:23,040 --> 01:17:25,490 where it's going to be very hard for human eyes 1604 01:17:25,490 --> 01:17:27,420 to be able to detect those differences, 1605 01:17:27,420 --> 01:17:29,690 and that's going to be very worrying. 1606 01:17:29,690 --> 01:17:30,950 URS GASSER: Jeanette. 1607 01:17:30,950 --> 01:17:32,700 JEANETTE HOFMANN: I'd also like to pick up 1608 01:17:32,700 --> 01:17:35,860 the question on hate speech. 1609 01:17:35,860 --> 01:17:38,460 What I find really good about this question 1610 01:17:38,460 --> 01:17:40,710 is that we have so many examples 1611 01:17:40,710 --> 01:17:49,110 that show how deeply ambiguous we are about such wording. 1612 01:17:49,110 --> 01:17:52,840 Facebook once told me the example of the term "bitch." 1613 01:17:52,840 --> 01:17:54,690 "Bitch" can be really dismissive 1614 01:17:54,690 --> 01:17:56,610 when you call a woman a bitch. 1615 01:17:56,610 --> 01:17:58,890 But nowadays, in some circles, "bitch" 1616 01:17:58,890 --> 01:18:00,600 can also be appreciative. 1617 01:18:00,600 --> 01:18:05,070 Women might refer to each other as bitches. 1618 01:18:05,070 --> 01:18:07,413 How is Facebook supposed to regulate-- 1619 01:18:07,413 --> 01:18:09,330 [? AUDIENCE: ?] Especially on [? Halloween. ?] 1620 01:18:09,330 --> 01:18:11,122 JEANETTE HOFMANN: I mean, how is Facebook-- 1621 01:18:11,122 --> 01:18:12,960 URS GASSER: I did not see that coming. 1622 01:18:12,960 --> 01:18:13,890 JEANETTE HOFMANN: --supposed to regulate wordings that 1623 01:18:13,890 --> 01:18:15,870 have such different meanings?
1624 01:18:15,870 --> 01:18:18,930 And that, I think, also shows 1625 01:18:18,930 --> 01:18:23,860 the limits of technical filtering of language. 1626 01:18:23,860 --> 01:18:26,400 Language is changing all the time, 1627 01:18:26,400 --> 01:18:29,730 and it also differs very much across cultures. 1628 01:18:29,730 --> 01:18:31,050 So there are really limits. 1629 01:18:31,050 --> 01:18:35,940 Another point, if I may, the question of empowering 1630 01:18:35,940 --> 01:18:38,130 versus disempowering. 1631 01:18:38,130 --> 01:18:41,500 I really like this question, because it implicitly refers 1632 01:18:41,500 --> 01:18:44,240 to the autonomy of human beings. 1633 01:18:44,240 --> 01:18:47,710 I think it's a mistake to think autonomy needs to be 1634 01:18:47,710 --> 01:18:49,630 defended against technology. 1635 01:18:49,630 --> 01:18:53,800 Technology, in many ways, enhances our autonomy. 1636 01:18:53,800 --> 01:18:55,700 Think of flying around. 1637 01:18:55,700 --> 01:18:56,860 Think of your watch. 1638 01:18:56,860 --> 01:18:59,170 I mean, we coordinate as societies 1639 01:18:59,170 --> 01:19:00,940 through these technologies. 1640 01:19:00,940 --> 01:19:03,770 And at the same time, they are disciplining us. 1641 01:19:03,770 --> 01:19:05,290 So it's not an either/or. 1642 01:19:05,290 --> 01:19:09,040 And technologies and human beings are not opposites. 1643 01:19:09,040 --> 01:19:12,190 It's a matter of how we structure and shape 1644 01:19:12,190 --> 01:19:15,470 the relationship between the two. 1645 01:19:15,470 --> 01:19:17,910 URS GASSER: Dean Nobles, is it OK if we go last with you? 1646 01:19:17,910 --> 01:19:19,658 Mr. President, Dean Nobles. 1647 01:19:19,658 --> 01:19:21,700 MELISSA NOBLES: Sure, I'll just take the question 1648 01:19:21,700 --> 01:19:23,710 of the computer scientist who said 1649 01:19:23,710 --> 01:19:26,620 these kinds of conversations are also happening 1650 01:19:26,620 --> 01:19:29,170 among computer scientists. 1651 01:19:29,170 --> 01:19:33,670 But we need to build better connections with others who are 1652 01:19:33,670 --> 01:19:35,380 thinking critically about it. 1653 01:19:35,380 --> 01:19:38,600 I think that obviously education plays a hugely important role 1654 01:19:38,600 --> 01:19:39,100 in this. 1655 01:19:39,100 --> 01:19:41,440 That is, early on getting students 1656 01:19:41,440 --> 01:19:43,360 of different disciplines to work together 1657 01:19:43,360 --> 01:19:45,880 and to learn together in a way that addresses 1658 01:19:45,880 --> 01:19:48,250 these questions exactly. 1659 01:19:48,250 --> 01:19:50,530 The challenge of all knowledge is making sure 1660 01:19:50,530 --> 01:19:52,900 that it doesn't stay siloed, that we work 1661 01:19:52,900 --> 01:19:55,180 in a truly collaborative way. 1662 01:19:55,180 --> 01:19:57,920 And it seems to me that's the challenge for the 21st century. 1663 01:19:57,920 --> 01:20:01,220 1664 01:20:01,220 --> 01:20:02,780 WOLFGANG SCHULZ: Yeah, I think I'll 1665 01:20:02,780 --> 01:20:07,140 pick the comment, if I may, on the funding 1666 01:20:07,140 --> 01:20:09,020 and interdisciplinary research. 1667 01:20:09,020 --> 01:20:11,660 I think we talk a lot about interdisciplinary research, 1668 01:20:11,660 --> 01:20:13,760 and we need it to solve problems, 1669 01:20:13,760 --> 01:20:15,920 but the academic system is not really 1670 01:20:15,920 --> 01:20:18,740 designed to cater to that need.
1671 01:20:18,740 --> 01:20:20,180 We still have problems with that, 1672 01:20:20,180 --> 01:20:23,110 and I constantly get phone calls from colleagues 1673 01:20:23,110 --> 01:20:27,070 who want to apply for project funding next week. 1674 01:20:27,070 --> 01:20:29,790 And they say, oh, we have just seen we need some ethics in it, 1675 01:20:29,790 --> 01:20:32,870 and we need a lawyer or something like that. 1676 01:20:32,870 --> 01:20:34,590 Would you be available? 1677 01:20:34,590 --> 01:20:37,190 And normally I'd say no, because it 1678 01:20:37,190 --> 01:20:39,640 has to be part of the project question 1679 01:20:39,640 --> 01:20:41,900 and not just icing on a cake that 1680 01:20:41,900 --> 01:20:44,990 has already been baked. 1681 01:20:44,990 --> 01:20:46,590 That makes no sense. 1682 01:20:46,590 --> 01:20:50,030 And so I think we have issues here in the academic system. 1683 01:20:50,030 --> 01:20:53,120 And maybe 30 seconds on the transatlantic issue. 1684 01:20:53,120 --> 01:20:59,060 I think it's really helpful and good for these questions 1685 01:20:59,060 --> 01:21:01,040 that there are really stable research 1686 01:21:01,040 --> 01:21:03,860 relationships between our American colleagues 1687 01:21:03,860 --> 01:21:06,755 and researchers in Germany. 1688 01:21:06,755 --> 01:21:09,050 It's really great that we have this relationship, 1689 01:21:09,050 --> 01:21:13,390 and it survives even if there is a political winter 1690 01:21:13,390 --> 01:21:14,858 or autumn. 1691 01:21:14,858 --> 01:21:16,400 And to solve these kinds of problems, 1692 01:21:16,400 --> 01:21:18,257 I think that's extremely helpful. 1693 01:21:18,257 --> 01:21:20,090 CRYSTAL YANG: Yeah, I'll just follow up also 1694 01:21:20,090 --> 01:21:22,632 on the research question, the excellent question over here. 1695 01:21:22,632 --> 01:21:24,710 I failed to mention, I am actually an economist 1696 01:21:24,710 --> 01:21:25,610 as well as a lawyer. 1697 01:21:25,610 --> 01:21:29,300 And I would welcome many more economists studying this area, 1698 01:21:29,300 --> 01:21:31,640 and I hope that the funding structures as well 1699 01:21:31,640 --> 01:21:34,850 as the incentives do promote that greater collaboration. 1700 01:21:34,850 --> 01:21:37,070 I think the computer science community 1701 01:21:37,070 --> 01:21:38,390 is doing amazing work. 1702 01:21:38,390 --> 01:21:41,210 It's often siloed from what the economics community 1703 01:21:41,210 --> 01:21:43,180 is thinking about, what the legal community 1704 01:21:43,180 --> 01:21:43,930 is thinking about. 1705 01:21:43,930 --> 01:21:47,030 And so I think initiatives like what Dean Nobles is doing 1706 01:21:47,030 --> 01:21:51,110 are probably a really great way of bringing people together. 1707 01:21:51,110 --> 01:21:53,870 And on education, there is such a need 1708 01:21:53,870 --> 01:21:57,890 for infusing this type of learning into the legal system. 1709 01:21:57,890 --> 01:22:00,710 And I don't think that US law schools, at least, 1710 01:22:00,710 --> 01:22:03,080 have really been at the forefront of this. 1711 01:22:03,080 --> 01:22:06,080 In fact, many of the decisions you read from state 1712 01:22:06,080 --> 01:22:09,380 supreme court judges who are ruling on the use of algorithms 1713 01:22:09,380 --> 01:22:13,580 and making important case law have explicit acknowledgments.
1714 01:22:13,580 --> 01:22:17,500 I'm paraphrasing here, but not far off: 1715 01:22:17,500 --> 01:22:19,797 the judges in this case were limited in their decision 1716 01:22:19,797 --> 01:22:21,380 making, because they didn't understand 1717 01:22:21,380 --> 01:22:22,910 how the algorithm worked. 1718 01:22:22,910 --> 01:22:24,960 Well, that's a really big problem. 1719 01:22:24,960 --> 01:22:27,185 And so we need to train the lawyers who 1720 01:22:27,185 --> 01:22:30,140 will be deciding these cases, working on behalf of clients 1721 01:22:30,140 --> 01:22:32,240 who are both creators of algorithms 1722 01:22:32,240 --> 01:22:34,310 and individuals adversely affected 1723 01:22:34,310 --> 01:22:39,030 by algorithms, to understand how algorithms work. 1724 01:22:39,030 --> 01:22:40,310 URS GASSER: [SPEAKING GERMAN] 1725 01:22:40,310 --> 01:22:41,480 1726 01:22:41,480 --> 01:22:44,370 INTERPRETER: Mr. President, you have the final word. 1727 01:22:44,370 --> 01:22:44,870 Thank you. 1728 01:22:44,870 --> 01:22:45,860 Thank you, indeed. 1729 01:22:45,860 --> 01:22:48,500 I'm not even attempting 1730 01:22:48,500 --> 01:22:51,190 to respond to all the questions that were put to us, 1731 01:22:51,190 --> 01:22:53,780 but allow me to begin with the following remark. 1732 01:22:53,780 --> 01:22:57,410 The debate that we have just been witnessing 1733 01:22:57,410 --> 01:23:00,460 with the participation of the audience 1734 01:23:00,460 --> 01:23:03,350 would undoubtedly be easier in the future 1735 01:23:03,350 --> 01:23:05,900 if we were to keep it free of any misunderstandings. 1736 01:23:05,900 --> 01:23:07,610 If I may come back to the question 1737 01:23:07,610 --> 01:23:10,357 that you put at the beginning, why 1738 01:23:10,357 --> 01:23:15,590 is there no economist amongst the people here? 1739 01:23:15,590 --> 01:23:18,230 You know, the economic potential of IT 1740 01:23:18,230 --> 01:23:21,980 and artificial intelligence is being seen sufficiently, 1741 01:23:21,980 --> 01:23:22,540 I believe. 1742 01:23:22,540 --> 01:23:26,140 So if you take a look at 1743 01:23:26,140 --> 01:23:28,220 the group of experts brought together here, 1744 01:23:28,220 --> 01:23:30,620 you will undoubtedly confirm 1745 01:23:30,620 --> 01:23:33,020 that everyone here is aware of the economic potential. 1746 01:23:33,020 --> 01:23:35,300 Everyone is aware of the technological potential. 1747 01:23:35,300 --> 01:23:37,490 Everyone is aware of the potential that 1748 01:23:37,490 --> 01:23:41,120 exists when it comes to fighting poverty, fighting 1749 01:23:41,120 --> 01:23:47,158 disease, fighting the impact of climate change. 1750 01:23:47,158 --> 01:23:48,950 If we want to be successful in those areas, 1751 01:23:48,950 --> 01:23:53,300 we need experts at the top level. 1752 01:23:53,300 --> 01:23:56,205 And we in Germany intend to participate in that development 1753 01:23:56,205 --> 01:23:58,220 just as much as you do. 1754 01:23:58,220 --> 01:24:03,750 But that is a kind of preliminary remark. 1755 01:24:03,750 --> 01:24:05,700 I want to be very clear: having said 1756 01:24:05,700 --> 01:24:09,110 what I've said doesn't mean that we end up 1757 01:24:09,110 --> 01:24:15,080 in an age of unbridled regulation, 1758 01:24:15,080 --> 01:24:16,230 a craze about 1759 01:24:16,230 --> 01:24:18,080 regulation.
1760 01:24:18,080 --> 01:24:20,210 When you look at the field of tension 1761 01:24:20,210 --> 01:24:25,450 between new technologies and AI on the one hand 1762 01:24:25,450 --> 01:24:27,770 and, on the other, what is the constituent element of our societies, 1763 01:24:27,770 --> 01:24:30,080 that is, democratic decision-making 1764 01:24:30,080 --> 01:24:32,350 processes in Western societies, 1765 01:24:32,350 --> 01:24:34,190 there is a field of tension. 1766 01:24:34,190 --> 01:24:36,830 Shouldn't we make that also a topic of the discussion 1767 01:24:36,830 --> 01:24:37,750 every once in a while? 1768 01:24:37,750 --> 01:24:40,100 And that is why I suggested making that the topic 1769 01:24:40,100 --> 01:24:41,710 of our discussion today. 1770 01:24:41,710 --> 01:24:47,180 So no one should assume or be afraid that this inherently 1771 01:24:47,180 --> 01:24:54,490 entails a secret wish to in some way influence the development 1772 01:24:54,490 --> 01:24:58,020 or to slow down the developments in the field of AI 1773 01:24:58,020 --> 01:25:00,410 and technologies of digitization. 1774 01:25:00,410 --> 01:25:02,065 That's not my intention, really, 1775 01:25:02,065 --> 01:25:03,740 but there is this field of tension I mentioned, 1776 01:25:03,740 --> 01:25:04,940 and we have to focus on it. 1777 01:25:04,940 --> 01:25:06,470 We have to deal with it, and this 1778 01:25:06,470 --> 01:25:08,540 is equally true for all those who 1779 01:25:08,540 --> 01:25:12,710 participate in the process of technological development 1780 01:25:12,710 --> 01:25:15,950 of these means of communication. 1781 01:25:15,950 --> 01:25:19,040 This should not be left to philosophers 1782 01:25:19,040 --> 01:25:20,540 or individual groups. 1783 01:25:20,540 --> 01:25:23,220 It has to be viewed as a topic for all of us. 1784 01:25:23,220 --> 01:25:28,460 And if we pursue such an approach, we will, I believe, 1785 01:25:28,460 --> 01:25:31,142 reach a point, and that has become obvious here 1786 01:25:31,142 --> 01:25:32,600 as a consequence of the discussion, 1787 01:25:32,600 --> 01:25:35,630 where we don't leave it at appealing 1788 01:25:35,630 --> 01:25:38,330 to the morals of each and every individual 1789 01:25:38,330 --> 01:25:41,390 and his or her responsibility. 1790 01:25:41,390 --> 01:25:46,700 But we need to have a debate across borders about whether there 1791 01:25:46,700 --> 01:25:49,340 should be limits to technological progress 1792 01:25:49,340 --> 01:25:52,250 that we should not surpass, because this, 1793 01:25:52,250 --> 01:25:55,430 at the end of the day, is what it is all about. 1794 01:25:55,430 --> 01:25:59,150 It's difficult enough when you look at Germany and the United 1795 01:25:59,150 --> 01:26:02,120 States of America, but it will become even more difficult 1796 01:26:02,120 --> 01:26:07,130 when you think about those countries that 1797 01:26:07,130 --> 01:26:14,210 have a completely different social system or approach. 1798 01:26:14,210 --> 01:26:15,950 But we need to have that debate. 1799 01:26:15,950 --> 01:26:18,730 We need to have it with a country like China. 1800 01:26:18,730 --> 01:26:21,950 And in saying that, I'm not cherishing any illusion 1801 01:26:21,950 --> 01:26:27,280 about us having in five or 10 years a kind of UN charter 1802 01:26:27,280 --> 01:26:28,720 on artificial intelligence. 1803 01:26:28,720 --> 01:26:30,240 We won't get that.
1804 01:26:30,240 --> 01:26:32,190 But nevertheless, we should engage 1805 01:26:32,190 --> 01:26:35,990 in that kind of a debate, just as much 1806 01:26:35,990 --> 01:26:39,890 as we have a debate with China, although we 1807 01:26:39,890 --> 01:26:42,680 have different views on the issues of bioethics 1808 01:26:42,680 --> 01:26:45,440 and genetic engineering. 1809 01:26:45,440 --> 01:26:47,430 We are not in agreement on these issues, 1810 01:26:47,430 --> 01:26:49,100 but nevertheless we have succeeded 1811 01:26:49,100 --> 01:26:53,660 in defining some limits or ceilings or restrictions. 1812 01:26:53,660 --> 01:26:56,450 1813 01:26:56,450 --> 01:26:59,370 Thus I am not discouraged, you know, in any way, 1814 01:26:59,370 --> 01:27:01,370 when I look at the possibility of such a debate, 1815 01:27:01,370 --> 01:27:05,000 although it's going to be a complicated one. 1816 01:27:05,000 --> 01:27:08,530 But you know, this was mentioned. 1817 01:27:08,530 --> 01:27:12,720 What we need is a transatlantic debate on the subject matter, 1818 01:27:12,720 --> 01:27:13,220 too. 1819 01:27:13,220 --> 01:27:16,240 1820 01:27:16,240 --> 01:27:19,940 Apart from all the topics of the day, the conflicts of the day, 1821 01:27:19,940 --> 01:27:22,930 and I don't want to downplay their importance, 1822 01:27:22,930 --> 01:27:25,435 we have to tackle the question of the importance 1823 01:27:25,435 --> 01:27:26,810 of the freedom of the individual, 1824 01:27:26,810 --> 01:27:30,290 of the democratic culture in the states of the Western world. 1825 01:27:30,290 --> 01:27:33,290 And we count ourselves amongst those, just as well as 1826 01:27:33,290 --> 01:27:34,790 the United States of America, and we 1827 01:27:34,790 --> 01:27:38,100 need to have that debate within the West first and foremost. 1828 01:27:38,100 --> 01:27:41,550 This is what I would wish: that we have the opportunity, time 1829 01:27:41,550 --> 01:27:45,050 and again, as I have been trying to do during my visit 1830 01:27:45,050 --> 01:27:49,730 here, to engage in discussions and debates that do not solely 1831 01:27:49,730 --> 01:27:53,560 focus on the present conflicts, trade 1832 01:27:53,560 --> 01:27:55,220 conflicts being just one case in point, 1833 01:27:55,220 --> 01:27:58,610 but to have a transatlantic dialogue about the issues 1834 01:27:58,610 --> 01:28:02,230 that are really at the essence of what links us and affects us 1835 01:28:02,230 --> 01:28:06,710 in the years to come and will be affecting us in the future, too. 1836 01:28:06,710 --> 01:28:08,900 I very much look forward to my next visit 1837 01:28:08,900 --> 01:28:12,640 to Boston and to Harvard, and thank you for having come here. 1838 01:28:12,640 --> 01:28:15,390 [APPLAUSE] 1839 01:28:15,390 --> 01:28:25,781