Name:
danah boyd Closing Keynote NISO Plus 2020
Description:
danah boyd Closing Keynote NISO Plus 2020
Thumbnail URL:
https://cadmoremediastorage.blob.core.windows.net/cb5f0733-cc96-4abc-a02f-121158d4a904/videoscrubberimages/Scrubber_5.jpg
Duration:
T01H09M53S
Embed URL:
https://stream.cadmore.media/player/cb5f0733-cc96-4abc-a02f-121158d4a904
Content URL:
https://cadmoreoriginalmedia.blob.core.windows.net/cb5f0733-cc96-4abc-a02f-121158d4a904/danah boyd closing keynote NISO Plus 2020.mov?sv=2019-02-02&sr=c&sig=lr97bv3gsVa1Hi8g7LIEfwsYMB33J%2BalEtcdeu532ns%3D&st=2024-11-24T00%3A42%3A09Z&se=2024-11-24T02%3A47%3A09Z&sp=r
Upload Date:
2023-02-13T00:00:00.0000000
Transcript:
Language: EN.
Segment:0 .
SPEAKER 1: I am incredibly proud to introduce danah boyd and her keynote on questioning the legitimacy of data. danah is a partner researcher at Microsoft Research. She is the founder and president of Data & Society, and a visiting professor at New York University. Her research is focused on addressing social and cultural inequities by understanding the relationship between technology and society. Her most recent books, It's Complicated-- The Social Lives of Networked Teens, and Participatory Culture in a Networked Age, examine the intersection of everyday practices and social media.
SPEAKER 1: She is a 2011 Young Global Leader of the World Economic Forum, a member of the Council on Foreign Relations, a director of both Crisis Text Line and the Social Science Research Council, and a trustee of the National Museum of the American Indian. She received a bachelor's degree in computer science from Brown University, a master's degree from the MIT Media Lab, and a PhD in information from the University of California, Berkeley.
SPEAKER 1: I will further say that danah is someone that I personally look up to a great deal, and make sure that I follow her intensely, because she is always ahead of the curve when it comes to understanding information in our world. So I would like to welcome danah. [APPLAUSE]
DANAH BOYD: Hi, everyone. So first, my apologies. I think Jason was being very nice. I am an idiot. This is entirely my fault. I don't know how many of you know New York's Penn Station, but I managed to get on a train going in the wrong direction. So this is entirely a mess-up of my making. And I looked into every way of trying to get down to you, and I couldn't make it work.
DANAH BOYD: So I am sad to not be with you. But I am excited to be able to talk with you today. So the talk I prepared is about questioning the legitimacy of data. And this is a tour through some of my current work. And then we'll have an opportunity for Q&A afterwards. So let's just get started. So, data-- we often talk about data as though it's oil, a natural resource that fuels opportunity but might also kind of devastate the planet.
DANAH BOYD: We talk about data with reverence, as though if we only had more data, we would solve all the world's problems. And that hype is really profound, especially as the conversation has been turning more and more to AI in every context. Now, I've often struggled with what we even mean by AI. I come from computer science. I have a particular technical definition.
DANAH BOYD: But when I look at all of the ads sitting in front of us at the airport, I know that the conversation about AI in business isn't the conversation about AI that I know. And I was trying to puzzle my way through this such that I asked a tech executive-- I'm like, why is everyone talking about AI? What's the big deal? And he looked at me and he said, ah, we talk about artificial intelligence because we can't talk about natural stupidity.
DANAH BOYD: And I was like, that's a terrifying way of going about this. Like, does this have anything to do with the technologies that I've been talking about? And they're like, no, not really. It's just a set of processes, a set of myths, a set of ideology. And that gets us back to Geoff Bowker's core point, which is that raw data is both an oxymoron and a bad idea.
DANAH BOYD: To the contrary, data should be cooked with care. So as we're talking about AI, we need to really talk about what that data is, what it looks like, where it comes from, what's going on there, and really tease that out in a sensible way so we can start talking about the legitimacy of all the technical systems. Now there's a problem with data, which is that the moment that data has significant power, people try to mess with it, right?
DANAH BOYD: And this has been true for forever. And I love this old economists' saying, which is that "If you torture the data long enough, they will confess to anything." Which is that, as social scientists, we've long known that data don't speak for themselves. They're used for all sorts of purposes to say certain things. And you really need to understand the systems that they are embedded in, in order to understand what is going to happen, how that data is going to get contorted.
DANAH BOYD: And the more that we become obsessed with the idea that data means so much, the more that that data becomes vulnerable to all sorts of attacks. And a lot of what we'll talk about today are some of the vulnerabilities that happen that really undo the possibility of data in important ways. Now, before I go there, I want to share a quote that I really love.
DANAH BOYD: Because I was spending a lot of time trying to figure out how to understand data within the business sector, and in particular within the social media and tech industry, where everybody is like, oh, people's data, personal data, et cetera. And this idea that if we just had Facebook data, we could actually understand all of human behavior, for which I sort of scratch my head with horror. But I'm not alone.
DANAH BOYD: Jeff Hammerbacher was the founding data scientist at Facebook. And he left pretty publicly when he decided that he couldn't watch the best minds of his generation spend their time making ads. But in talking to him-- he gave a talk at Data & Society. And people asked him about how to make sense of the data that was happening in Facebook.
DANAH BOYD: And he stated this. He said, "I found that the people who ascribe the most power to statistics and data are not people who do statistics and data science; they are executives who give vision talks about the power of data. I've seen so many cringe-inducing assertions. In my head, I'm listening to all these, and I'm like, I remember that conversation.
DANAH BOYD: And the data which that is based on is so utterly flawed and unlikely to be true. But it supports the mythos of the particular executive, so let's just repeat it until it is true." And that mindset is one that we often see, which is the idea that the data become used to achieve certain kinds of political messages, to achieve certain kinds of business interests. But that creates a huge problem as we start to care more and more about the data.
DANAH BOYD: Because the data's legitimacy is not simply about the data's quality. The data's legitimacy comes from a belief that we can collectively come together and believe that those data are sound, valid, and fit for use. And this becomes really contested on a regular basis, such that there are times where we actually call data legitimate when its quality is really problematic.
DANAH BOYD: And there are times when the data actually has high quality but we challenge its legitimacy. So I want you to hold both of those in mind as we start to go down to talk about different aspects of how we get to this data ecosystem, and what we need to do once we have it. OK, so on the first day of an introduction to data science class, I actually pull up data and ask students to spend time working with it.
DANAH BOYD: And I often tend to mess with them. I kind of trick them into doing funny things. And so on the first day of the last class I taught, I asked students, who were supposed to come to the class having already built their environment, knowing how to work the system, to load a file that I put on the screen. And that file was the New York City Stop and Frisk data set. And I asked them, I'm like, pull it in, show me that you can actually read the file into your environment, show me that you can actually work with the material, and tell me what the average age of somebody who's been stopped is.
DANAH BOYD: Hand after hand comes up, and they say 27. And pretty much the whole room has gotten to a point where they're all announcing, 27. And I pause, and I say, so is that accurate? And everyone looks at me like I'm an alien. They're just like, what do you mean, is it accurate? We all got 27. Clearly that's the right answer. And I'm like, well, what does it mean?
DANAH BOYD: And students start to make assumptions. They start to say, well, you know, I would have thought that it was going to be a younger number. But I guess there are a lot of homeless people, and they're older. And what you see is they start projecting a set of social norms and values onto the data. And again, I push back. And I'm like, well, might there be any problems with that data?
DANAH BOYD: Is that data right? And again, students are looking at me like I'm an alien. So I ask them, first, do me a favor. There's two variables in your file. There's a variable for age and there's a variable for date of birth. Do they match? The students come back. And they're, like, no, they don't.
DANAH BOYD: Why don't they match? And then I'm like, would you please run a distribution on the age? And their eyes get wide. And they're like, why are so many people 99? I'm like, why are so many people 99? And of course what we realize is that this data set, this New York City Police Department data set, is rife with problems. There are people recorded at age 0, people recorded at age 99.
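To make the exercise concrete, here is a minimal sketch of the checks described above, written in Python with pandas. The file name and the age, dob, and stop_date columns are hypothetical stand-ins; the real NYPD stop-and-frisk releases use different field names and formats.

    import pandas as pd

    # Hypothetical file and column names -- the real NYPD releases differ.
    df = pd.read_csv("stop_and_frisk.csv", parse_dates=["stop_date", "dob"])

    # The "obvious" first answer: the average age of people stopped.
    print(df["age"].mean())

    # Check 1: does the recorded age agree with the date of birth?
    implied_age = (df["stop_date"] - df["dob"]).dt.days // 365
    print((implied_age == df["age"]).mean())  # fraction of rows that agree

    # Check 2: inspect the distribution before trusting the mean.
    # Spikes at 0 and 99 suggest placeholder values, not real ages.
    print(df["age"].value_counts().sort_index())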
DANAH BOYD: And that's not necessarily saying that those people are those ages, but that that is what is in the data. And I use that to remind people that the data that we're working with is regularly messy. Data, writ large, as all of you know too well, is filled with all sorts of error. And that error comes from all sorts of natural ways. It comes from our instruments being off.
DANAH BOYD: It comes from these questions of how these data get biased, and who's trying to do what with the data, how the data are taken from one environment to another. And we have to really interrogate those biases, and see how they're playing out. And that's where I keep thinking about how we're building a set of architectures for a data-driven future where we're assuming that the data are higher quality than they often are.
DANAH BOYD: We're assuming that we can de-bias the data. We're assuming that the systems that are designed with good intentions can't possibly go wrong. But part of what my work is about is trying to really interrogate where it does go wrong, where things don't play out as we expect. And so let me start with an example from Latanya Sweeney. She's a computer scientist up at Harvard.
DANAH BOYD: And one day, she was sitting down with a journalist, and she was trying to recall a paper she had written. And so she threw her name into Google, figuring that that would be the best way to find the paper really fast. And the journalist pointed out the ads that came up adjacent to her name, and said, why are you getting these ads for these criminal-justice-related products?
DANAH BOYD: And she said, oh, well, that is odd. And as a computer scientist, she decided she was going to run an experiment. She was going to test Google's system. And she took the different baby names that existed, and she looked at the racial categorization of those baby names across history, and she threw them at Google. And what she got back was that there was a specific company that was buying ads on baby names as a whole, and basically saying whether or not they needed background checks, or arrest records, different things, public records, all of these different materials.
DANAH BOYD: There were six different possible ads. But when you search for names that were predominantly black and African-American names, you got criminal justice products. And when you searched for names that were primarily white names, you got things that were more generic background checks. And she thought this was really fascinating. One might immediately assume that this meant that Google must be doing something nefarious in segmenting out baby names.
DANAH BOYD: But that's not actually what goes on, and Latanya knew that. What was happening was that Google gives advertisers the option of buying against a whole category like names, and then the system works on learning which names get which ads in order to give the client, the advertiser, the best performance possible. Now, what that meant is that when people were searching on Google for baby names that were more associated with black and African-American names, they were more likely to click on the criminal justice products, and vice versa for the white baby names.
DANAH BOYD: In other words, a very racist public taught Google how to be racist on its behalf so that Google then could amplify that to all of us, right? And this is the kind of moment where I don't think any one of those individuals who was clicking on those ads really thought through how their own prejudices were actually shaping their decisions of what to click on in a search. And Google, which was not categorizing these names by their racial histories, certainly was not thinking that, oh, this racist public was going to create this feedback loop.
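As a toy illustration of that feedback loop-- this is not Google's actual ad system, and the click probabilities are invented for the sketch-- here is an epsilon-greedy selector that learns from clicks which of two hypothetical ads to show against a name:

    import random

    # Two hypothetical ads competing on one name category.
    shows = {"criminal_record_ad": 1, "generic_background_ad": 1}
    clicks = {"criminal_record_ad": 0, "generic_background_ad": 0}

    # Assumed behavior of a prejudiced public: a slightly higher click
    # rate on the criminal-justice ad for names coded as Black.
    click_prob = {"criminal_record_ad": 0.06, "generic_background_ad": 0.05}

    def pick_ad(eps=0.05):
        if random.random() < eps:  # occasional exploration
            return random.choice(list(shows))
        # Otherwise show the ad with the higher observed click-through rate.
        return max(shows, key=lambda a: clicks[a] / shows[a])

    for _ in range(100_000):
        ad = pick_ad()
        shows[ad] += 1
        if random.random() < click_prob[ad]:
            clicks[ad] += 1

    print(shows)  # a one-point gap in clicks hardens into mostly one ad shown

The point of the sketch is that nothing in the selector mentions race; the skew comes entirely from the clicks it is fed.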
DANAH BOYD: But yet here we are. And that's the kind of thing that happens with a lot of data-driven systems. And so Virginia Eubanks spent a lot of time trying to understand how these systems perpetuate longstanding practices, how a system that was not really designed to be biased per se will actually get infused with all sorts of prejudice. And what she found is that because people get obsessed with trying to double down on the ways of making the algorithms accurate or making certain that the data is accurate, they end up reinforcing the long systemic histories.
DANAH BOYD: Because most predictive technologies certainly use the past as the anchor to the future. And if the past is extremely problematic, so will the future be. And this is one of the things she argues in Automating Inequality, which is that even with the best set of designs, if we don't account for the way the systemic biases exist within the data, they're going to work their way into the architecture.
DANAH BOYD: Now, how does this actually play out in practice? One of the examples that I think is really critical in this environment is what's going on with the criminal justice and judicial decision-making ecosystem. So as many of you may know, the United States has an extremely racist history, especially as it connects up to incarceration. And mass incarceration has perpetuated a whole set of longstanding racial prejudices.
DANAH BOYD: And many people are looking at this data, saying, can we de-bias it, can we make this data not be infused with such racial prejudice. And in particular, how do we implement things like predictive policing, where you allocate the resources of law enforcement officers to particular locations depending on where criminal activities are most likely to occur, can we make that less prejudiced.
DANAH BOYD: And can we make the opportunity to get bail less prejudiced. And so people are often starting with the data itself, saying, hey, here is this data set. This is the longstanding histories. Now, the difficulty here is that it's not just what data exists, but it's also what data is missing. So take the predictive policing example or equivalent of this. So take something like drugs. Actually, in the United States, white individuals are more likely to consume and sell drugs than black individuals.
DANAH BOYD: But black individuals are far more likely to be arrested and prosecuted for both the consumption and sale of drugs. And as a result, the data that we have, from arrests to convictions, is really rooted in the data that we collect, which is primarily about black individuals. We don't send police to college campuses to go looking for drugs, even though the drug consumption there is among the highest in any given geographic segment.
DANAH BOYD: And the result is that missing data means that there's no way to really meaningfully, quote, unquote, de-bias the data that we have, because the data we have is so flawed in relationship to the data that is more general. Well, this becomes even more insidious when we think about what happens in terms of the decision-makers. So the idea with the predictive policing system is it is often about the law enforcement officer going to a particular environment and then deciding whether or not to pursue an arrest.
DANAH BOYD: Well, the place we often think is our backstop is the criminal justice system. So the idea is that even if a judge gets a signal, they should be able to recognize the large systemic issues at play, and they should be able to correct it. Now, unfortunately, what we also know is that judges are kind of racist. They've had that history themselves. So when they're the backstop, their biases get in the way.
DANAH BOYD: And they're more likely to actually use those biases in ways that actually cause harm. So this is one of the reasons why people started creating these risk scores, which is that given the individual, what is the risk profile. And therefore, should they be encouraged to be out on bail, and that the judge should really sort of move in that direction. And this is often thought of as, like, well, if you put the human in the loop you can have the best of the data-centric world and the best of human judgment.
DANAH BOYD: Unfortunately, this doesn't work exactly as planned. And so what happens in the risk assessment environment is that judges, who generally will verbally say that they ignore the risk score, that they're not paying attention to it, receive it as part of the broader packet. And they say, OK, well, you know, I'm not going to use that. But depending on what their own prior is, they start to shift towards the risk score.
DANAH BOYD: Now why? Let's actually deal with the very human realities of this structure, which is that a judge is elected, or a judge has a job for which they could be fired. And they don't want to have to explain why they let somebody out on bail when the algorithm told them not to; they don't want to have to show how they went against that. It's a lot easier to go with the algorithm.
DANAH BOYD: And then you can place the blame back on the algorithm. And it's not, really, of course, even the algorithm, but that's sort of the nice thing we like to say. It's actually the whole way in which that system was designed, between the data and the technical architecture. And that dynamic, that sociotechnical dynamic, means that a human in the loop is often not there as a backstop, but as somebody that becomes fully integrated in the whole process in a way that actually can end up causing harm back to the human.
DANAH BOYD: So where does this actually play out? So my colleague, Madeleine Elish, was researching the history of autopilot in aviation. And the disturbing reality of airplanes these days is that they're not actually flown by pilots. You are being flown by an autopilot. And you have people that are sitting in that cockpit who are there to babysit machines. And that was really set up in the 1970s, when, during a set of congressional hearings, it was decided that the navigator was no longer needed, so that person could be fired, and no longer have a job.
DANAH BOYD: But there needed to be a pilot and co-pilot on every plane, just in case the machine went wrong. Now, what does this mean in practice? This means that a human who's been babysitting this machine for a long time has not actually been practicing on the job. So they've been systematically deskilled on the job. And they're supposed to step in when all goes wrong. They're supposed to step in and say, hey, there's a problem here.
DANAH BOYD: And they're supposed to be able to fix it. Now, unfortunately, when things are really going wrong with the system, it's not so easy for a human to jump in and fix it, even if they're quite skilled and if they've had a lot of practice. And so what ends up happening is that because the human is the last person to touch the machine, it is almost always declared as an issue of human error rather than an issue of technical error, even though a lot of where these things begin is often due to technical error.
DANAH BOYD: Now, people like to point out, like, ah, but the landing on the Hudson-- Sully, he managed to make this work. And I would like to point out who Sully was. He was the person-- or is, but he's no longer working in the profession. But when he was working in the profession, he was the one who trained people on weekends on how to re-fly during emergencies.
DANAH BOYD: He was probably the most skilled person for landing on the Hudson possible. And that is not a normal state. Most people are not actually given that opportunity. Now, Madeleine argues that these humans end up actually being the liability sponge. They absorb all the liability of the technical system. And what she argues is that this creates a moral crumple zone, that the system is set up to absorb all of the pain on impact.
DANAH BOYD: But it's not the actual technical system that absorbs that, it's the human. The human is stuck in the middle. So if you think of the crumple zone as the part of your car that absorbs on impact, this is the human getting caught up in that way. Now we, of course, saw parts of this on full display with Boeing. Now the funny thing about Boeing is that people are like, ah, well, this is all a complaint about the pilots not being able to step in.
DANAH BOYD: And indeed that's true. This is a moment where the systems were failing to meet the standards that had been set up for them. They were not evaluated and tested, because of course we've scaled back a lot of our testing protocols. And these systems were not prepared. But I would like to point out that, of the two crashes that are officially attributed to the Boeing airplane, they were initially blamed on human error.
DANAH BOYD: And it turns out it looks like there's a third that happened even earlier that was also blamed on human error. So it's interesting to think about how many people have to die before we have to go, maybe it's not the humans that are the problem. And this is where I want to think about what it means to put that human in the loop, not as a solution, but maybe as part of the problem, because they don't actually have the power you think they do.
DANAH BOYD: In many ways, data doesn't have the power. So where is that power, and what are these data? And I spent a lot of time looking at different kinds of data, but for the purpose of this talk, I want to talk about the data that really relates to humans, so people, and their practices, and their activities. So this is the data that we think about with social media or search engines.
DANAH BOYD: This is the telephone companies. This is the data that tells us something about people. Now, for the most part, we find that the data ends up in the world of data science through either choice, circumstance, or coercion. So let's unpack what those mean. For most people in the industry, they like to think that all data arise by choice, which is the idea that if I share with you, you'll share with me, and this will be-- we'll get a good relationship back and forth.
DANAH BOYD: And it's funny, because I think about this in the early days of Fitbit. People were so excited back when Fitbit came out, which was like, I'm going to share my steps, and I'm going to share them with my family through the interface. And I know that Fitbit is going to have this, but they're going to actually provide a service back to me. And this is going to be awesome.
DANAH BOYD: But the reason why I think Fitbit is also a good example is that it didn't stay this way. It started out as a question of choice, which is that people gave it to them willingly and excitedly. And then it became integrated into all of these work systems, as part of insurance. And now people of course are putting their Fitbit on their dogs to get the step count up so that they can keep their insurance low.
DANAH BOYD: Now, on the flip end of this is something like data by coercion. And this is of course where data is extracted from you without your choice. And I often like the example of the spit and acquit program in California. So the Supreme Court, in a case called Maryland versus King, ruled that collecting the DNA sample of people during an arrest was equivalent to collecting a fingerprint or a photograph.
DANAH BOYD: This would be a unique identifier. This is a painful thing for the court to have argued, because they seem to have missed ninth grade biology. Because this, of course, is not an individual identifier. This is an identifier of an entire network, all of the people that you're related to. So what ended up happening is that people get pulled over for various driving issues or other criminal activities, and they're required to spit, and provide a sample to law enforcement in exchange for not getting arrested.
DANAH BOYD: Now, that is quite coercive. But where it's even more coercive is the way in which law enforcement then uses this to model out the relations of the people that they're curious about and trying to find. Now imagine what it would be like for an LAPD officer to knock on your door asking you about your brother who is wanted for arrest, only for you to not realize you had a brother.
DANAH BOYD: That would be awkward. And this is where we start to see the ripples of the coercion. And that's one of the things to realize, is that the coercion doesn't just do harm to the individual; the coercion does harm to the ecosystem as a whole. And we see that as a longstanding issue. One of the reasons why black individuals do not participate in a lot of medical studies is because of the long history of very coercive data practices.
DANAH BOYD: So the ripples are over time and over communities, and result in all these other unintended consequences. Now, most data that we work with ends up in analysts' hands because it's data by circumstance. It's handed over because you want to be on Instagram with your friends. And so you hope that Facebook won't mess with you. And you hand it over. And you really just don't want to think about it any further.
DANAH BOYD: And this is often said, ah, well, people don't care about privacy. It's like, actually, people just don't want to think about these issues. They don't want to think about all of the unintended consequences. They're hoping it will all work out, and that the structure is in place for it all to work out. And of course this forces us to then think about where the data uses come into play.
DANAH BOYD: And that's where we start to see the difference in really what matters, and how we evaluate those data. So let's talk about a positive context. Most people really love the idea that we could cure cancer, and the idea that we could build more data analytics, engage in precision medicine, advance science in coherent and strong ways. That is fantastic, and that is super important. Now, even in this context, of course, we recognize the history of where people have had biases.
DANAH BOYD: We run into certain problems. People don't trust the system. And my colleague Kadija Ferryman was examining a moment like this, where there was a notice that was sent out to everybody that said that a particular biomarker was associated with a specific heart disease. And as part of the screening, if people had these biomarkers, you should sort of flag them for this heart disease.
DANAH BOYD: And this went out across all of New York. And then, all of a sudden, doctors were saying, wait a minute, it seems as though everyone who has this biomarker is black. What's going on here? And it turns out that the studies that were related to that biomarker and that heart disease had never actually accounted for who was in their sample and who wasn't. And so this meant that they ended up getting inaccurate science because of these other systemic issues with regard to data bias.
DANAH BOYD: Now, this is still in an area where people really are trying to do their best and trying to make a difference. Let's switch that around into areas where things get a little bit more sketchy. So think about something like scheduling software. For those who aren't familiar with this, every retail worker in a place like the Gap is going to get their schedule every week based on a set of factors that are built through a particular piece of scheduling software.
DANAH BOYD: This is true for ERs, this is true for retail, for a lot of different minimum wage jobs. Now, I could design for you a system that would really maximize all of the opportunities to really make workers happy, to make certain that they're working with the people that they like the most, that you optimize for the shifts that they need for child care and other such things, that you make certain that they get the number of hours and with the kind of schedule that they want.
DANAH BOYD: This is a thing that you can actually build, where of course you can talk about how somebody is still going to have to do the night shift, and you have to do these trade-offs, et cetera. But that's actually not what's incentivized in the system. Because of course, who's paying for these systems? The person paying for them is the person who is employing these workers, the managers. And what the managers are most incentivized by is not simply worker happiness.
DANAH BOYD: They're incentivized to make certain that all shifts are covered. And that means filling them as last-minute as possible. They're motivated to make certain that workers do not work with the same people over and over again so that they can't unionize. They're incentivized to make certain that workers don't have stable schedules, so that they have to be on call and don't take other jobs because of those last-minute changes.
DANAH BOYD: They're incentivized to make certain that workers only work 32 hours, so that they don't have to pay benefits, et cetera. Those are not a problem of the system itself. The design of the system is shaped by these factors that go far beyond the technical work, far beyond the data. And this is where it becomes really tricky because we have to account for how the data is going to be used, and whether or not we value those uses.
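As a deliberately tiny, entirely hypothetical sketch of that point: the same optimizer picks opposite schedules depending on whose incentives define the score.

    # Two candidate weekly schedules, with made-up attributes.
    candidates = [
        {"name": "stable",   "hours": 40, "coverage": 0.9, "on_call": 0, "fixed_team": 1},
        {"name": "flexible", "hours": 32, "coverage": 1.0, "on_call": 1, "fixed_team": 0},
    ]

    def worker_score(s):
        # Reward a consistent team and enough hours to qualify for benefits.
        return s["fixed_team"] + (s["hours"] >= 40)

    def manager_score(s):
        # Reward full coverage and on-call flexibility; penalize crossing
        # the benefits threshold -- the incentives described above.
        return s["coverage"] + s["on_call"] - (s["hours"] >= 40)

    print(max(candidates, key=worker_score)["name"])   # "stable"
    print(max(candidates, key=manager_score)["name"])  # "flexible"

Same data, same algorithm; only the objective function changes.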
DANAH BOYD: And this, of course, is an "employment in a private sector" context. But let's shift it to a public-sector context for a moment. We are about to embark on the 2020 census. And this is not going to come in the same form as it came 100 years ago. But this process of convincing the public that they should participate in the census is a really critical activity.
DANAH BOYD: Why? Because of course this is how we allocate our representatives. This is how we allocate funding. So our apportionment and our redistricting is dependent on this. All of our justice cases trying to combat discrimination are dependent on these data. All of social science research is dependent on this data. But where do you negotiate with the public?
DANAH BOYD: They're required to fill this out by law. At the same time, you don't really want to engage in a coercive act. You really want to get people to participate voluntarily. And this is a really hard challenge, because not everybody is completely wedded to making sure we get a good count. And as you will inevitably see in about a month from now, we will be having a lot of fights about what it means to do a census, what it means to count all of the people, who should count, how they should be counted, what it means to guarantee participation, et cetera.
DANAH BOYD: And that's where we start to see all of these things interconnect. So how does this fall out on the flip side? Let's start talking about where it can come undone. And I think about this as data infrastructure. Because a lot of these systems turn into not just data for themselves, but data infrastructures. And the census is a case in point, which is that we need this piece of infrastructure to work so that we can actually manage our democracy and all of these other things.
DANAH BOYD: Well, when the data has that much power, people try to mess with the data, and they always have. In the case of the census, they've been messing with the data since 1790 in all sorts of ways. And you really have to think about how to secure or stabilize that data infrastructure. And this is a problem in this whole new world of AI, which is that we're seeing tons of vulnerable data infrastructures.
DANAH BOYD: And so I'm going to give you an example of one of those vulnerabilities, a kind of example that sits between the social and the technical in a critical way. And this is work that Michael Golebiewski and I have been doing on something called data voids. Now, for anybody who thinks that this word sounds a lot like my name, it does. And you can blame Michael for that. This was not my idea.
DANAH BOYD: But data voids, nonetheless, are the thing that we refer to when saying that the search query space-- so if you're dealing with a search engine and you're running a search, there are certain terms for which there is limited data. If you search for a term like basketball, you are going to get tons and tons of data made available to you, because there's so many web pages and other content that is about basketball.
DANAH BOYD: But if you search for something that's much more esoteric, there are far fewer responses. And this is where it creates a certain vulnerability, where people can start to exploit that, and try to shape search results in a way that actually invites people to much more nefarious environments. And this is a world of media manipulation. And I'm going to talk through some of the kinds of exploits that we've seen of data voids and the harms that they've caused.
DANAH BOYD: So to begin with, I want to talk about breaking news. So when an incident first happens, people often rush to a search engine trying to get more information. So consider what happened on November 5, 2017. It was a Sunday. And you might have been drinking coffee or hanging out with your family, and your phone would have alerted you that there was an active shooting in Sutherland Springs, Texas.
DANAH BOYD: And if you were like me, you've probably never heard of Sutherland Springs, Texas. In fact, people mostly had never heard of it, because we looked at the search query space, and it looked as though zero people had searched for Sutherland Springs in the previous year. And indeed, at that moment, if you had searched, what you would have gotten is automatically-generated pages from Zillow, census data, Wikipedia entries-- these kinds of material that were automatically generated, or stubs waiting to be filled out.
DANAH BOYD: Why? Not that many people lived in Sutherland Springs. And nobody paid attention to it as a site of any piece of news. But in this breaking news moment, it was the only thing to search for. Now, that meant that it became a site of attack. It became a vulnerability. And what we saw was that at the time there were all of these people who took on the label of alt-right, and often far right, and they wanted to make certain that this particular shooting was associated with who they imagined to be their arch-nemesis, Antifa.
DANAH BOYD: And they really wanted to pin the shooting on Antifa. So they got to work. They started writing things on Reddit, on Twitter, hoping that this would affect Google. And it did. But one of the things that happened is that, in their process, they were also trying to get journalists to cover things.
DANAH BOYD: They were reaching out to journalists pretending to be just curious citizens, being like, hey, might Sutherland Springs have to do with Antifa? So much so that a journalist at Newsweek decided that he would cover this story. And it was one of the first stories at the gate before we had any meaningful information, really, about what was going on in Texas. And this story did a decent job of talking about this attempt to manipulate the information environment.
DANAH BOYD: But there's a problem with his headline, "Antifa Responsible for Sutherland Springs Murders According to Far-right Media," which is that when you search on Google for something like Sutherland Springs, Google returns news headings at the top. So that would be really a positive thing. But they cut long headlines. And that meant that for the first 24 hours, you would search for anything related to the shooting, and what you would get was "Antifa Responsible for Sutherland Springs Murders." So even in this attempt of reporting, Michael Hayden ended up reinforcing the goals of a group of far-right actors.
DANAH BOYD: Now, a second kind of exploit has to do with strategic new terms. And there's an interesting, long history to this. So I don't know how many of you are familiar with Frank Luntz, but Luntz was known in the 1990s for being able to come up with pithy phrases, terms that would get everybody to imagine something in a particular way. And you would know many of his terms, because they include things like climate change, or partial-birth abortion, or death tax, et cetera.
DANAH BOYD: Now, when he was working, in the '90s, what he did was he would make certain that that new phrase would get into the hands of journalists. And the way he would do it was to make certain that members of Congress were repeating it over, and over, and over again until journalists started covering that as the phrase. And that was a way of amplifying the phrase. Well, what's happened in an internet ecosystem 20 years on is that we are seeing people again try to come up with these pithy phrases.
DANAH BOYD: But rather than immediately going to journalists and trying to get the journalists to cover the phrase, they spend a lot of time trying to make certain that there's a lot of content available related to that phrase before it's given to the journalist. And then they try to get the journalists to adopt that phrase so that people will then go back to search engines and search for the phrase. And so to put this in practice, I want to talk about how a particular conspiracy theory sort of unfolded.
DANAH BOYD: So the term crisis actor emerged within a group of conspiracy theory forums after the Sandy Hook shooting in Connecticut, which was the shooting of 5-year-olds in a kindergarten. And the conspiracy idea was that Sandy Hook didn't actually happen, but was actually created by the deep state in order to take your guns away. And as part of manufacturing this conspiracy, they labeled the people that went onto national TV talking about it as crisis actors.
DANAH BOYD: And they didn't really get far with regard to Sandy Hook, but they started to produce tons and tons of content related to this conspiracy. And then with each new shooting, they would work on trying to get it in the hands of the journalists. And they finally succeeded at the Parkland shooting, when David Hogg was asked by Anderson Cooper, live on CNN, whether or not he was a crisis actor.
DANAH BOYD: And this was extraordinarily successful at getting this concept into the mainstream. And indeed the spike that you're seeing on the right is about queries during Parkland around this term. And it brought a lot of people into this conspiracy and into the rabbit hole, starting to imagine what they were saying was true. Now, a third type of vulnerability has to do with terms that are outdated, which is that if people are not producing new content related to a term, people can come in and try to produce stuff that is much more problematic.
DANAH BOYD: And they can do it in multiple ways. One, they can produce content that itself is problematic. Or the second is that they can actually take quasi-OK content, and write within the comments to try to get people to go to much more problematic content. And indeed we see this a lot of times with terms that are outdated. A fourth type has to do with fragmented terms, which is that search engines are structured so that they're looking for terms within the content that matches your query.
DANAH BOYD: So if your query actually sends people down different paths, you end up in totally different content spaces. So this is a slightly outdated example, but if you search for "Vatican pedophiles" on YouTube, you get entirely different content than if you search for "Vatican sexual abuse." So rather than resolving these, search engines actually fork you. And you end up with worlds that are totally separate.
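A crude way to see that forking-- real search engines are vastly more sophisticated, but this sketch ranks on literal terms, over page titles invented for the example:

    # Toy keyword matching over hypothetical page titles.
    docs = [
        "vatican pedophiles exposed",
        "vatican sexual abuse report",
    ]

    def search(query):
        terms = query.lower().split()
        return [d for d in docs if all(t in d for t in terms)]

    print(search("Vatican pedophiles"))    # matches only the first title
    print(search("Vatican sexual abuse"))  # matches only the second

Neither query ever surfaces the other's results, even though both are about the same topic.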
DANAH BOYD: And this of course is encouraged by people telling you to Google certain things, actually shaping the phrases. And this is one of the things we regularly see within white nationalist and other racist environments. In fact, all over New York City, I regularly run into things telling me to Google certain phrases. And if you listen to talk radio, you'll often hear this as well. So the idea is that if you craft the term in a particular way, you can make certain that you get to the right section of the query space.
DANAH BOYD: And of course another layer to this is a set of problematic queries-- queries that have a lot of problematic results, in part because problematic actors are the only people producing content. So for example, "did the Holocaust happen" was a deeply problematic query for a long time. And to try to address that, the Anti-Defamation League started producing content trying to fill those holes.
DANAH BOYD: Well, another one that is constantly a problem is "how to get laid," because there are not many sex-positive sites that are producing content around this. This becomes a pretty horrific space pretty fast. Now, if you want to understand what the ramifications of this are, I want you to think about a particular case from June 17, 2015. And that was when a young white nationalist entered the Charleston Emanuel church, and sat down for over an hour with a group of black churchgoers before opening fire, murdering nine.
DANAH BOYD: And when this would all come out, his manifesto would become public. And what we would learn is that he actually was searching, two years before, for Trayvon Martin, trying to figure out what that was about, ended up at Wikipedia, came out of Wikipedia, which he believed to be a neutral source, believing that George Zimmerman was in the right. He then came across a particular phrase in the Wikipedia entry, a phrase that was purposefully placed there for people to trip into and to enable search, and that's the phrase of black-on-white crime.
DANAH BOYD: He then searched that, and he ended up in a world of white nationalism. And he spent two years becoming more and more indoctrinated towards a set of hateful ideologies before he would go and terrorize this community. And I say this because this is one of the few cases that we've had where we have full documentation of his pathway through this, that we can both see in his manifesto, and it turns out that there's a lot of data showing that this is true.
DANAH BOYD: So these things become a real serious vulnerability, a way in which our systems can be manipulated by altering the data ecosystem in order to achieve specifically nefarious purposes. And so how do we think about some of those nefarious purposes? And I want to start off with a concept that I really love, and it's the concept of agnotology. And there's a book by Proctor and Schiebinger that really does a good job-- from the '90s-- laying this out.
DANAH BOYD: And this term sounds strange. You may not be familiar with it. But it's basically the study of the production of ignorance. And let's parse that for a second, which is that we often think that knowledge production, and the study of it, which is epistemology-- we often think that the way we know things is an ever-learning and progressive activity, which is that you just keep learning. But in fact ignorance is not simply not yet knowing.
DANAH BOYD: In fact, it can often be something that's more like the strategic undoing of knowledge. And one of the things that we see in a world of media manipulation is this ongoing effort to undo knowledge, to destabilize information, to make it so that we don't know what we trust. And I want to talk about that in this section of really looking at how our information ecosystems come undone, and the role that technology plays in that.
DANAH BOYD: And I want to pull us back to 2017, when Cory Doctorow was trying to make sense of the alternative facts conversation. And he made an argument that really, to this day, stands out for me as highlighting this tension, this hyperpolarized environment that we're living in, in our current society. And he says we're not disagreeing about facts, we're disagreeing about epistemology, the way we know what we know.
DANAH BOYD: The establishment version of epistemology is that we use evidence to arrive at the truth, vetted by independent verification, but trust us when we tell you that it's all been independently verified by people who are properly skeptical, and not the bosom buddies of the people they were supposed to be fact-checking. The alternative facts epistemological method goes like this.
DANAH BOYD: The so-called independent experts who are supposed to be verifying the so-called expert evidence-based truth were actually in bed with the people they were supposed to be fact-checking. So in the end, it's all a matter of faith then. You either have faith that their experts are being truthful or you have faith that we are. Ask your gut what feels more truthful. That is a beautiful way of undoing knowledge, getting to a point where it's no longer about trying to come together and build a coherent way of knowing, but to try to purposefully fragment knowledge, fragment communities, fragment the idea of expertise and trust.
DANAH BOYD: And this is something that we've seen as a core to propaganda for quite some time. So let's look at Russia Today. Russia Today, or RT, is a Russian-backed TV and news station, including a website. And one of their campaigns, one of their sort of more notable campaigns, which was called the Question More campaign, was designed to try to undo knowledge by working within the kinds of frames that we would usually think of as media literacy.
DANAH BOYD: So consider this ad on the left. And there was a whole slew of them. The ad on the left says, "Is climate change more science fiction than science fact?" And the subtext here is, "Just how reliable is the evidence that suggests that human activity impacts on climate change? The answer isn't always clear-cut, but it's only possible to make a balanced judgment if you are better informed.
DANAH BOYD: By challenging the accepted view, we reveal the side of the news that you wouldn't normally see. Because we believe that the more you question, the more you know." Now, if this was not about climate change, and about the production of evidence for a scientific consensus, we might think that this kind of question, of learning more in order to question and ask-- and really interrogate what's going on makes sense.
DANAH BOYD: But when it's positioned in this way, it's designed to help undo knowledge. And there was a ton of these different kinds of ads, so much so that London outlawed them. They said that they were not going to be allowed to be placed on the tube and other public environments. And the result is that RT came back and put up other ads basically declaring that they'd been censored.
DANAH BOYD: And in the process of saying that they'd been censored, they basically enticed people to come back and learn more about their process. This is a way in which you can reel people in-- not everybody, a certain number of people-- to try to undo the knowledge that you can see around us. So where are we seeing this play out over and over again? I was tempted to actually switch into the coronavirus component of it, because this is all playing out real-time.
DANAH BOYD: But in some ways, because it's so real-time, we don't actually have as much data as we do in an environment like anti-vaxxing. So anti-vaxxing actually shows us some of the worst parts of what's called the boomerang effect. So as many of you know, the conspiracy is that vaccinations are given by governments, and that they have all of these horrible consequences, and that they're there in order to basically help control the population in all these different ways.
DANAH BOYD: That's sort of the extreme end of the conspiracy, and then backing off from that, you get some conversations about the idea that they're hiding the evidence that vaccines have a relationship to autism. And it doesn't matter that there's been scientific consensus saying that there is no correlation between autism and vaccination. It doesn't matter that the original paper that supposedly proved this was shown to be falsified.
DANAH BOYD: Where we're at right now is that a huge chunk of the population, an unacceptable amount, believes that vaccinations are dangerous. And what's more disturbing is that the more that the Centers for Disease Control or the news media puts out stories trying to calm the public and say there is no correlation, vaccines are safe, et cetera, et cetera, the more you get what's called a boomerang effect.
DANAH BOYD: So rather than actually coming to believe the scientific information, people believe-- because they don't trust the news media and they don't trust the government-- they believe that there has to be something to this conspiracy, and they want to go and self-investigate. And that's a really disturbing place to be in as people are trying to make sense of the environment or the information landscape around them.
DANAH BOYD: Because rather than being able to build scientific consensus, we end up with these environments where we don't actually know how to deal with people who are building conspiracies alongside knowledge production. And that is a real, real problem in a public health context, and one that we are grappling with full-on right now. So my colleague Francesca Tripodi really wanted to try to understand what was going on here-- can we explain some of these different interpretations of the information landscape?
DANAH BOYD: And so she ended up focusing on one particular community of Christian evangelicals, mostly conservative, living in Virginia. And she wanted to understand how they were making sense of the information landscape, and how that may differ from how these systems were designed. And she ended up really looking at what was happening in a Google space. And one of the pieces of her study was when she was sitting at a Bible study.
DANAH BOYD: And she was going through what she was very familiar with-- a typical process of choosing a particular scriptural passage, then going through a thick interpretation of that passage, and discussing what it meant in people's lives. But after spending an hour doing this with this particular text, the pastor then turned everybody to the tax reform bill-- all right, and that dates this-- and said, now let's actually analyze the tax reform bill using the same skills.
DANAH BOYD: And this should be really startling to people. Because it's like, how do you understand a tax reform bill, which was not written with the same sort of mindset as a biblical text? It was written with a very different set of understandings of what that writing was for. What does it mean to then interpret it through scriptural inference? And what she found was that this process of scriptural inference was actually being applied to a variety of information landscapes.
DANAH BOYD: And nowhere was this clearer than what she was seeing with regard to Google. And so when she was doing some of her fieldwork, it was in the middle of the Virginia primaries. And primaries are really interesting, because regardless of your political affiliation, you end up trying to make sense of a variety of different candidates in order to vote in a primary. Because if you are a Republican, you're going to be choosing amongst all of the Republicans.
DANAH BOYD: But how do you get information? So she started interviewing all these people, saying, what are you doing? How are you getting information? And they were consistently telling her that they didn't trust fake news, by which they meant CNN. So she's like, OK, so what do you do? And they were like, ah, well, we go to Google. And she was like, OK, sure, so then what do you do?
DANAH BOYD: And they were like, well, we go to Google. And what she realized is that people were searching on Google for what they saw as the right search query, often given to them by talk radio, but also, even in this context, just individual names. And then they were treating Google as though it was going to give them all of the information, and it would give them both sides, just like Wikipedia is going to give them both sides.
DANAH BOYD: So rather than going and clicking onto any of these particular pieces of content, they were trying to interpret, with the same scriptural inference process, and make sense of the Google layout. Now, of course, when Francesca explained this to Google, everyone there nearly had a heart attack. Because that is not part of their design mechanism. They can't imagine that the decisions of what they're pulling would be interpreted in such heavy fashion.
DANAH BOYD: But indeed that's what she was seeing. So I say all of this because we're in this moment where, as we talk about building and standardizing knowledge, and really thinking about how we know what we know, we're also in an environment where there are these different structures that are actually pulling people apart, pulling people to a point where they don't necessarily know what they know.
DANAH BOYD: All right, so how then do we move forward with this? I'm convinced that we're not going to fix the mess we're in, certainly not in 2020, by fact-checking news, or regulating social media, or improving the data and de-biasing it. These are all perfectly reasonable interventions that serve in many ways as Band-Aids. Because they fail to recognize that we're actually in this environment that is much more structurally problematic, where information, and trust, where the ideas of what it means to know something, are coming undone in all sorts of critical ways.
DANAH BOYD: And so we need to actually see this as a sociotechnical problem, which is to say that the issues that are at play in this environment are not technical or social. They're the combination of the two. And in figuring out how to bring these things together, we need to start with the assumption that there will be adversaries trying to attack the systems, and we need to build a whole set of social fabrics that actually can make a difference in building out these networks.
DANAH BOYD: And I think about this in terms of some of the interesting histories of the internet. So I often think about my early days online, where as a young person coming up in the days of Usenet, I found all sorts of strangers that were able to help me make sense of the world. And they were really there holding me, to help me figure out what was going on around me.
DANAH BOYD: And I'm forever grateful for the information that I learned from them. What I've learned today is that there are no longer people that are really thinking about how to hold people online. And those who are doing so are usually doing it for much more nefarious purposes. The result is that a lot of young people go online trying to make sense of the world around them, trying to find people that can hold them, and support them, and help make sense of things.
DANAH BOYD: And they're running into a cacophony of chaos that they're not sure how to make sense of. And that's really disturbing, because that means that we see this huge vulnerability, where these people, as they're trying to make sense of the information, can easily come across and be pulled in by much more nefarious content. And indeed, if we look at the young white nationalist in Charleston, part of what's most devastating about his radicalization is that it was done online by people who were trying to hold him, who gave him a framework that we should be disturbed by.
DANAH BOYD: So the second key thing that I want to point out in addition to this is that we need to think about how we build social networks. We've built these things out of our information networks. And they're being made vulnerable. But the most important thing is that we're no longer just having to deal with the kinds of things that have a data void, or search engine optimization, or other ways in which we've seen manipulation of networks of information.
DANAH BOYD: We are now seeing the same manipulation of networks of people. And this is a really deeply problematic part of the transformation: we are seeing a segmentation of networks in ways that are actually polarizing, and where you don't have bridges. And nowhere do you see this more clearly than at the institutional level.
DANAH BOYD: So I often like to think about George Washington. When the Constitutional Convention was underway, he didn't speak. He didn't talk. He didn't think he should-- except for one thing. He argued that it would be dangerous for the nascent democracy for each member of the House of Representatives to represent 40,000 people. He said a representative could maximally represent 30,000 people, because otherwise they wouldn't know their constituents, and they couldn't hold together the nascent country.
DANAH BOYD: And of course we're now at a point where it's 750,000 people per representative, and people don't trust government. They don't know government. They don't know a journalist. They don't know somebody working in tech. And if you don't know people in these professions, you don't trust the professions. And so the more our networks become brittle and fragile, the more we run into all of these different problems.
DANAH BOYD: So as we turn to Q&A, I want to leave you with the idea that we're facing more and more sociotechnical challenges. And those sociotechnical challenges arise because these great technical systems are now mediating all sorts of human interactivity. But if we're going to address them, if we're going to address those vulnerabilities, we need to bring back some of the best thinking from security-- not just assume that we can rid ourselves of bias, but understand that these systems are under attack, just as democracy is under attack.
DANAH BOYD: And if I think back to our founding fathers' narratives, the idea was never that we would have a perfect union, but that we should always strive towards a more perfect union. And nowhere is that clearer right now than when you look at technology: we can't assume that the status quo is at all acceptable. We need to figure out how we can move towards a better and more sophisticated technical world.
DANAH BOYD: Thank you very much.
JASON GRIFFEY: All right. Hi, danah, this is Jason.
DANAH BOYD: Hi, guys.
JASON GRIFFEY: All right, you're live on screen now. So we can see you. So be aware. Thank you for such a fantastic talk. I'm going to be quiet, and ask if there are questions. Anybody have a question they want to ask danah? Over here. All right. Introduce yourself.
AUDIENCE: Hello, my name's [? Athena ?] [? Heppner. ?] I'm the discovery services librarian at the University of Central Florida. But that's not what my question is about. It was a fantastic presentation-- very scary. And as I was watching it, I was thinking, oh, I'd love to show this to my family, who are mostly pretty conservative. And I was thinking of what their response would be, which is to immediately dismiss it, because it attacks some of the positions or parties that they align with.
AUDIENCE: And I'm sure you must see that a lot. So do you have any thoughts on how you present this in a broader sense, in a way that people don't just shut down if they don't agree with the politics of it?
DANAH BOYD: So I think one of the biggest challenges is that a presentation never gets you to the same place that a conversation does. And this is one of the things we've always seen with regard to politics: you build bridges, really, one on one. And that's actually one of the reasons why I'm most concerned about the state of our media ecosystem. Because we keep thinking that we can amplify, that we can make these things bigger, that we can tell stories at scale.
DANAH BOYD: But actually what really makes a difference in people's lives is really being able to make those connections as human connections. And now with that in mind, a lot of what I'm trying to say-- I mean, part of it is that I start out, unquestionably, as a scholar, from a commitment to empirical knowledge. And so this is where it's really tricky. Because what I'm challenging, in many ways, is a different way of producing knowledge and understanding, and one that's rooted in an experience, for example, of the media being evil.
DANAH BOYD: And I get it. As somebody who knows way too many journalists, I get to see them make mistakes. But because I know them, they're human to me. And that's a very different experience than if you don't know them. And so that's one of the reasons why, from my vantage point, in addressing so many of these systemic issues, I don't know that I'm going to be able to convince an individual.
DANAH BOYD: The question is, how do we start building a systemic intervention to a systemic problem? And for me, that's one of the reasons why I see some of the more powerful interventions in finding ways to build connective tissue. So let me give a concrete example. One of the really interesting things about the 1960s is how we got to the Civil Rights movement. How did we get to a point where we could have a meaningful conversation about race in a deeply polarized United States?
DANAH BOYD: And most people talk about the fabulous leaders who came out and spoke loudly. And they were critical, because they gave a narrative. But there was actually an institution that played a critical role that people don't realize. And that would be the military. So what did the military do? The military, during World War II, actually spent a lot of time figuring out how to bring people together across the country, regardless of politics, regardless of race, in order to serve alongside one another at a point where they really needed them to.
DANAH BOYD: And so what did that do? It meant that you had to learn how to lay down your life beside somebody you didn't really like at first. And you needed to learn to trust somebody to defend your life who you weren't sure shared your politics. And so what you saw coming out of the military-- from people who spent those war years there, but also in the following decade, which is actually probably more important in this process-- was that they would come out saying, I don't like X people-- people from Tennessee, people from this particular racial demographic, et cetera-- I don't like those people, except that one guy I served with, and I really liked him.
DANAH BOYD: And I point this out because the military ended up serving as one of the biggest creators of social fabric in the United States in the 20th century. And this was true in different ways throughout the century, but it also came undone. It came undone in part because of the privatization of the military, where the privatized commitments were not the same-- it was really more about efficiency and optimization than about the larger social project the military was engaged in.
DANAH BOYD: So this is where I think there are fabulous conversations to be had with your family, with your friends, trying to challenge and see where your disagreements are. And some of those disagreements are far more nuanced than a lot of people realize. But I think the bigger challenge here is how we make a real systemic change that allows us to build those social fabrics, so that we can actually learn to hear and recognize one another again.
DANAH BOYD: So that, to me, is going to be, I think, more effective than me being able to give talks across those lines. I hope that helps.
JASON GRIFFEY: Yeah, that was great. That was a great answer. Anyone else? I'm running the mic here. Give me one second.
DANAH BOYD: And my apologies for not being able to see you. So I hope-- hi, everybody.
JASON GRIFFEY: [CHUCKLES] All right. So again, introduce yourself, please.
AUDIENCE: Hi, danah. Thank you for that great talk. My name is Bohyun Kim. I'm from the University of Rhode Island. So I was very impressed and really fascinated by your discussion of agnotology. That was a completely new term to me. And it struck me in a particular way, because I studied classical philosophy when I was in grad school.
AUDIENCE: So it reminded me of what happened in ancient Greece, when Socrates was fighting with the sophists over their making all sorts of absurd claims. And Socrates actually had a huge following among the youth-- he was actually tried for corrupting the youth of Athens. So I was wondering, in spite of all the current depressing examples, since you also study youth, do you see any seed of hope in the social movements we see a lot in our society against this?
AUDIENCE: And particularly in younger generation, do you see any way in which we might make a crack?
DANAH BOYD: Right. So I think it's important to recognize that what we're seeing in terms of polarization and confusion in our US political ecosystem, and in our ways of knowing, is unfortunately happening in every generation. And I want to point this out because-- take something like white nationalism. Unfortunately, we are seeing a rise of white nationalism amongst young people. It's not actually a huge number, per se, but they have undue voice.
DANAH BOYD: And in having undue voice, because they're amplified by news media, because they know how to manipulate different kinds of social media, they're able to slowly recruit certain people. And that's a real challenging thing. Because though you have a small percentage of people, they can actually cause huge damage. And they can of course cause tremendous violence, which is one of the things we're also seeing.
DANAH BOYD: Now, I say that because, with most young people-- again, you have a small fraction, just as you've always had, who are super activists. And if there's anything at the center of that youthful activism, it's really coming in the form of climate change. You're really starting to see that bubble up, with young people trying to make a statement and take a stance on it.
DANAH BOYD: And you're also of course seeing some of the more identity-related issues that have always been part of youthful movements. But then we have to deal with the fact that the majority of young people, and in fact the majority of Americans, are not particularly engaged politically or ideologically in one way or another. And that, again, is not new.
DANAH BOYD: I think a lot of people want more people to be engaged on these issues than not. And that's one of the reasons why I think it's important to think about the polluting of the information landscape. Because in many ways it's not about the pollution for the people that are most radical or the people who can be really pushing out and challenging it. It's about all of the folks who are just looking around trying to make sense of things.
DANAH BOYD: So for example, the biggest challenge for Facebook is not actually young people. The biggest challenge for Facebook is boomers, who are adopting nearly any conspiracy they can get their hands on right now, and who are actually the most conspiratorial in their ideology. With young people, it's a different challenge. The main challenge is the young people who are really disillusioned and feel as though they have no opportunity-- the young people who don't feel like they can get into college, or who, if they do get to college, are going to have huge loans; the young people who just feel as though upward mobility is collapsing in on them.
DANAH BOYD: And they want to blame any number of things around this. And that's one of the reasons that I believe, for us as adults, we have to look not just at what the young people are doing in challenging this-- and they are. They're speaking out against it. And we're certainly seeing the conversation, for example, around debt become a huge part of our political ecosystem. But also to realize that if we don't build a set of structural interventions to make more opportunities available for young people, the potential for them to become more radical is real.
DANAH BOYD: And that's another place where we can certainly look at 20th-century Europe as a marker: when youth do not feel as though they have opportunity, when they feel as though nothing good can come, you start to see them embrace more and more extreme positions. And so that's what I worry about, and where I see an opening for danger. And I say that because it's not going to be the vast majority of young people who lead it; it's just that that potential is an ever-present reality.
DANAH BOYD: So this is the other thing I will say-- one final point-- which is that, as adults, we have now socialized two whole generations into a tremendous amount of fear, because we constantly tell them about all the dangers in the world. And there are very real dangers in the world. But the result is a feeling of true uncertainty, a true feeling of paralysis around opportunity.
DANAH BOYD: And so this is where I say it behooves each of us, as an adult, when we're really engaging with young people, to work hard on creating and opening opportunities, rather than trying to restrict young people's movements, their ability to engage and come of age. Because if they don't have access to publics in a full adult sense at a younger age, they will create their own publics. And those aren't necessarily the most positive.
DANAH BOYD: So I say that because it's not, to me, about hope or fear, it's about all of it at once. And I think that, for me, I don't want to put all of the burden on young people to fix the problems that we adults have created. I think we have to take a real responsibility for having built these structures, and we have to work as adults to fight back on their behalf.
JASON GRIFFEY: All right, thank you so much. danah, we are at time on our end. So I'm going to let everybody give you one more round of applause, and hope you can hear it.
DANAH BOYD: Thank you. [APPLAUSE] [MUSIC PLAYING]