
The PreCrime Unit
Barry: Every Tuesday in downtown Los Angeles, the LAPD police commissioners have a board meeting that starts at 9:30 in the morning.
Tape: Please call the roll. Good morning, let the record reflect that Commissioners Bonner, Decker, Soberoff, and Goldsmith are present and we have a quorum.
Barry: There are five police commissioners, all civilians and volunteers, who are appointed by the mayor. The meetings are open to the public, and there's time allocated for public comments on every agenda item. Anyone who requests it gets two minutes to speak to the board. There's a clock right in front of the podium that starts a countdown as soon as you begin, a bell goes off when your time is up, and then your mic gets cut off.
Sound
Barry: The last meeting of the year, 2018, was in mid-December, right before the holidays. The first item on the agenda couldn't sound more boring and bureaucratic. The commissioners are being asked to approve a donation from a charitable organization of $35,000 to the LAPD.
Soberoff: To be used to reconfigure the existing conference room into a Community Safety Operations Center for the benefit of operations, West Bureau.
Barry: Steve Soberoff is the president of the Police Commission, and he's presiding over the meeting. There are four LAPD bureaus: Central, West, South, and Valley. The Community Safety Operations Center, or CSOC for short, started in South Bureau as a response to a rise in violent crime in South Los Angeles in 2016. Think computers doing data analytics, centralized intelligence sharing, that kind of thing. After a couple of years, South Bureau saw a significant reduction in homicides and gun violence, so the mayor, Eric Garcetti, pledged to spread CSOCs to every other bureau in 2018. Time for public comments.
Soberoff: Go ahead.
Music
Tape: Do you know what a CSOC is, that you're going to approve $35,000 in funding for?
Tape: What these operation centers are used for is to gather and store and analyze and share-
Tape: License plate readers, body cameras, CCTVs.
Tape: Surveillance data, which is then used to criminalize members of the community.
Tape: And from there, individuals are identified and targeted.
Tape: CSOCs prime officers so that they are more likely to use force.
Tape: You are sending officers out into the field frightened, and creating this use of force that is resulting in people being killed.
Tape: And that is where CSOCs come in, because they are the central nerve centers.
Soberoff: Next speaker please. Time's over now. Next speaker please.
Barry: There are probably about 40 people in the room, including myself, flanked by police officers on both sides, and 11 people spoke against the approval of CSOC funding. Many were members of the Stop LAPD Spying Coalition, a group of community organizers who have been suing the LAPD to release documents about their computerized surveillance policies. Afterward, one of the police commissioners, Shane Goldsmith, addresses the concerns before moving for a vote.
Goldsmith: Um, I will vote on this, approve this, but I did want to just acknowledge the concerns that have been raised-
Soberoff: It's her turn to speak now.
Goldsmith: My turn. Thank you. I know that nothing short of abolition of the police department and this commission will satisfy you.
Soberoff: We have a motion for approval please.
Tape: Moved.
Soberoff: Okay, we have a second?
Tape: Shame on you!
Soberoff: All in favor? Aye. Anyone opposed?
Tape: [shouting against]
Tape: A vote of 4-0.
Tape: So I'm gonna ask you all to stop disrupting the meeting.
If you can't stop disrupting the meeting, then the meeting will stop anyway.
Barry: The approval of funding for the West Bureau CSOC is a small but symbolic step in LA's ongoing move toward predictive policing technologies. The goal is to have computer programs predict who, where, and when the next crime is going to occur, and to direct police units to intervene and prevent it. For one side at this meeting, the CSOC represents how a once notorious police department can turn to technology for progress and reform, replacing the prejudice of human judgment with impartial data and algorithms. For the Stop LAPD Spying Coalition, algorithmic objectivity is a fiction, a cover. For the coalition, the CSOC is just another efficiency tool to target, incarcerate, and control racial minorities in a rapidly gentrifying city. It's a debate that will eventually spread across the country, because the technology is moving at a rapid pace, and police departments everywhere are looking for an upgrade. One way to anticipate how cities around the country will react is to look at how the debate is unfolding in LA.
Tape: You're rubber-stamping CSOCs and you're rubber-stamping the same goddamn policies, you're speaking from both sides of your fucking mouth and-
Soberoff: Speaker please.
Tape: So what you're doing is-
Soberoff: Your time is over.
Tape: Shut up.
Soberoff: You're disrupting the meeting.
Barry: At this point about six police officers in the room start moving in, and they open a digital video recorder that one of them has been holding the whole time, and they start filming.
Tape: Of course we want abolition of policing, of course we want abolition of hearsay. Shame on you, Shane Goldsmith, and you're a fucking president of a free-
Soberoff: Ladies, that's your last warning or you're both gonna be leaving.
Tape: Shame on Shane! [repeated chanting]
Soberoff: You're both gonna be leaving. Both of them are out. Thank you.
Tape: [continued shouting]
Soberoff: Thank you. Next speaker please. Other people who'd like to speak? Other people who follow the rules and would like to speak. You're taking their time.
Tape: You're a fake, you're a fraud.
Soberoff: You're out of the meeting. You're continuing to disrupt.
Tape: Fuck you, Soberoff.
Soberoff: Happy holidays to you too.
Tape: Billionaire piece of shit.
Soberoff: Next speaker.
Tape: The next speaker is Adam Smith.
From Slate, this is Hi-Phi Nation, philosophy in story form. Recording from Vassar College, here's Barry Lam.
Barry: Steven Spielberg's adaptation of the Philip K. Dick story Minority Report is now 17 years old. Given what we know today, parts of it were prophetic, parts of it absurd. The film tells the story of a future where Tom Cruise pieces together psychic predictions of violent crimes on a high-tech computer, then sends out a team of cops to arrest and jail the perpetrator before the crime occurs. They're called the PreCrime Unit. Real-life predictive policing is here now. The CSOC that the LAPD commissioners funded is the real-life equivalent of Tom Cruise's control room. Beyond that, actual predictive policing programs show just how unrealistic the movie was. The crimes in Minority Report were all bourgeois fantasies: murders of cheating spouses, conspirators, and child kidnappers, and all of them had affluent white victims and white perpetrators.
In real life, predictive policing technologies target property crime, drug dealing, and gun violence associated with gangs, the kinds of things affecting communities of poverty and color.
The other piece of fantasy in Minority Report is what it depicts as the central problem with predictive policing: people have free will, so the psychics might have been wrong. In the real world, free will isn't the issue. The real philosophical problems are more basic, maybe even harder. And there aren't any psychics. Just statistical science based on a little criminology, a little anthropology, and a lot of data collection, all pulled together by the developing fields of machine learning and artificial intelligence.
In the next two episodes I'm guiding you through the use of statistical algorithms in criminal justice, from the streets to the prisons. Lots of forces are pushing to replace human judgments with computerized ones, and it's happening at a time when the rules aren't known and what counts as justice isn't obvious.
Meredith's drop: Hi-Phi Nation will return after these messages.
Sarah: My name is Sarah Brayne and I'm an assistant professor of sociology at the University of Texas at Austin.
Barry: Sarah Brayne is going to be my guide. Sarah embedded herself for years with the LAPD, doing ride-alongs, observations, and interviews, studying how these new technologies are changing the relationship between the police and the community. Sarah, let's start with the basics. What's PredPol?
Sarah: Sure. PredPol is a location-based predictive policing software that's used to predict where and when property crime is likely to occur in the future. So essentially it takes three kinds of inputs that are all part of historical crime data: when, where, and what type of crime occurred. More recent crimes are weighted more heavily, and then it outputs these 500-by-500-foot boxes where crime is more likely to occur in the future, and then police officers are given these printouts or these images at the beginning of their shift and told to spend time in those predictive boxes.
Barry: Tell me the range of things officers do in the field with that kind of information.
Sarah: I mean, essentially they drive to those boxes and then they check in and out of those boxes during uncommitted time, meaning if the officers were not responding to a call, for example, or at the station booking somebody. They were told to spend their uncommitted time driving to those predictive boxes and basically looking around and seeing if anything was happening there. And of course intercepting if they saw a crime in progress, but a lot of it is just this deterrent strategy: if you're sitting there in this high-crime area, somebody who maybe was gonna steal a car, if there's a cop car sitting right there, wouldn't steal that car.
Barry: It doesn't sound really dramatic at all. It sounds really boring, and just what you would expect people to do. Like before this you would just guess, like, I guess that's a chop shop over there or something.
Sarah: Yeah, exactly. That's the thing, is I think that the more you actually learn about predictive policing, the more you're like, "Oh, okay." It's not that different than what they were doing before.
Barry: And it's true. Even the Stop LAPD Spying Coalition admits that these predictive policing technologies are just continuations of existing practices that have long been a part of patrolling. The important question is whether that's a good or bad thing.
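Sarah's description, when, where, and what type of crime, with recent crimes weighted more heavily and scored over 500-by-500-foot boxes, is enough to sketch a toy version in code. What follows is a minimal, hypothetical Python sketch of that idea, not PredPol's actual model (which, as noted below, is adapted from earthquake-aftershock forecasting); the data shapes and the 30-day half-life are assumptions for illustration.

from collections import defaultdict

CELL_FT = 500  # the 500-by-500-foot boxes Sarah describes

def cell(x_ft, y_ft):
    # Map a coordinate (in feet, in some local grid) to its grid cell.
    return (x_ft // CELL_FT, y_ft // CELL_FT)

def hotspot_scores(crimes, now_day, half_life_days=30.0):
    # crimes: list of (x_ft, y_ft, day) triples from historical crime reports.
    # Each past crime adds weight to its cell; recent crimes count more,
    # via exponential decay with an assumed 30-day half-life.
    scores = defaultdict(float)
    for x, y, day in crimes:
        age = now_day - day
        scores[cell(x, y)] += 0.5 ** (age / half_life_days)
    return scores

def predictive_boxes(crimes, now_day, k=3):
    # The k highest-scoring cells become the "predictive boxes"
    # handed to officers at the start of a shift.
    scores = hotspot_scores(crimes, now_day)
    return sorted(scores, key=scores.get, reverse=True)[:k]

# Example: predictive_boxes([(120, 980, 1), (150, 990, 29), (5000, 40, 30)], now_day=30)
# clusters the first two crimes into one box and ranks that box first.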
Barry: Advocates of location-based predictive algorithms like PredPol claim a big advantage over ordinary person-based criminal profiling. PredPol never uses social categories like race, gender, age, or criminal history. In fact, location-based systems don't use any identifying information about individuals, and predictive software is a lot more transparent than, say, the human mind. PredPol's equations, algorithms, and controlled studies are all published in peer-reviewed academic journals. PredPol uses the same predictive models that geologists use to forecast aftershocks after major earthquakes. It's not clear why that works, but that's actually one of the points of using predictive algorithms: if a forecast is accurate, it doesn't matter why.
But one criticism of PredPol is that it leaves law enforcement stuck forecasting crime in the same known neighborhoods and locations, making particular places feel occupied, or overpoliced. And the crime history data that PredPol uses also comes from law enforcement itself. And law enforcement has conflicting incentives that can affect the accuracy of its data. But things are starting to get fancier. PredPol is version 1.0 of location-based predictive policing, using crime to predict crime. There's no reason why algorithms have to be so limited.
Flora: Hello.
Barry: Hello.
Flora: You hear me alright?
Barry: Yeah, I hear you. Can you hear me?
Flora: I'm Flora Salim, I'm a senior lecturer at RMIT University School of Science.
Barry: Machine learning researchers like Flora Salim can presumably use any data they have to see if a machine can find positive correlations with crime.
Flora: We came up with certain handcrafted features that we actually extracted from the check-ins data.
Barry: Salim and her colleagues got their hands on check-in data from the mobile app Foursquare in Brisbane and New York City. If you don't know about Foursquare, it's an app you use to learn about restaurants, attractions, events, and so forth. For a while it kept check-in data on all of its users. The data is anonymized; there's no identifying information. But you do have information about an individual's check-in history. And you can see how many check-ins are at a location and how that changes over time.
Flora: So we look at the number of venues and the number of check-ins. We look at the diversity of these check-ins across locations.
Barry: There are a lot of interesting things you can measure using just check-in data. One example is diversity. For instance, you know whether a group of people tend to be into the same things based on their similar check-in histories. So if you can see that a lot of these people congregate at location X, then X is a homogeneous location. If you find a location where people with very different check-in histories are congregating, then it's a diverse location. Another thing you can look at is the ratio of newbies to regulars at a place. And these things change over time as people are moving around the city.
So what you do is give the computer all of this check-in data. Then you feed it crime reports for crimes like assault, unlawful entry, and drug dealing. And you do it with data for a set period, like six months, and let the machine scan all the data, cut it up, look at ratios and changes, and determine how check-in patterns are correlated with particular crimes. This is the training phase. Once that's done, the computer has come up with its best model for predicting future crime, and you move on to the testing phase.
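Before the testing phase, here is a toy sketch of two of the handcrafted features Salim mentions, crowd diversity and the newbie-to-regular ratio, computed from anonymized check-ins. The data shapes and feature definitions are assumptions for illustration; the published feature set is richer.

import math
from collections import Counter

def diversity(profiles_seen):
    # Shannon entropy of the visitor profiles checked in at a venue.
    # profiles_seen: one label per check-in (say, a taste-cluster id derived
    # from each anonymous user's check-in history). Higher = more mixed crowd.
    counts = Counter(profiles_seen)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def newbie_ratio(visits):
    # visits: list of (user_id, prior_checkins_here) pairs for one time window.
    # Returns the fraction of first-timers among everyone checking in.
    if not visits:
        return 0.0
    return sum(1 for _, prior in visits if prior == 0) / len(visits)

# Training phase, in outline: for each (venue, time window), build a feature
# vector like [n_checkins, diversity(...), newbie_ratio(...)] and a label for
# whether a crime (assault, unlawful entry, drug dealing) was reported nearby
# in the following window, then fit any standard classifier to that data.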
Barry: In the testing phase, you give the computer new check-in data that it hasn't seen before, and you ask it to make a prediction about where and what kind of crime is going to occur. Now here's the really ambitious part of the experiment. Salim and her colleagues were looking to predict new crimes that are supposed to happen within three hours of check-in. Their goal is to give officers real-time data about where to increase their patrols. PredPol, by contrast, predicts crimes that are supposed to happen within the next day. To see how well your algorithm does, you compare the computer's predictions with the actual crime reports from the next three hours.
Flora: You know about PredPol, right?
Barry: Yeah, absolutely.
Flora: We've improved it much more. We've managed to have improvement of up to sixteen percent.
Barry: Here are some of the patterns that the machine found. In general, as locations become less popular, crime increases. Crime is likelier to happen as locations start getting newer or more infrequent visitors. And as a location becomes more diverse, the likelier you are to have crime in the next three hours.
What humans tend to do is come up with a theory, an explanation of why things pattern the way they do, but this isn't always a good thing. I once gave two lectures about this stuff, where in the first one I told the audience the real data: crime decreases as a place becomes more popular. The audience immediately came up with an explanation: more people make it harder to get away with a crime, so criminals target quieter areas. But in the second lecture I said the opposite, that crime increases as more people check in. And the audience immediately had an explanation for that: more people mean more opportunities for crime.
This is a virtue and a vice of human judgment. We're good at explanations, and we're good at being convinced of our explanations, letting them guide our thinking whether they're true or not. The machines don't do this; they only make the predictions, leaving the stories for the humans.
Now, AI researchers aren't so naïve as to think that their algorithms are unbiased or story-free. The claim, though, is that they're far less biased than people, who get too caught up in the stories they tell themselves and often ignore data completely.
Salim and her colleagues were able to get accuracy without needing crime histories or locations and times. The mere movement patterns were enough. Who knows how accurate things can get when you start putting all of this data together, or bring in new kinds of data.
Tape: To be very hypothetical, we wear these fitness trackers, and a lot of them actually have sensors that track our heart rates. It also tracks your mood and your emotions. If you can correlate a lot of things, for example, your galvanic skin response, which is basically how much you sweat and all that, that also can tell us some signals to do with your stress level. Even something more fine-grained on that level can potentially boost accuracy even more.
Barry: A lot of people I talk to get creeped out at this point. But Fitbit data has already been used to solve crimes, pinpointing exact times of death. One sociologist even proposed that we use Fitbits to monitor the police, to predict which officers are likeliest to have unusually high stress responses and use force too easily.
It's a hard question for all of us how much data about ourselves we're willing to give up in the interest of public safety. But the unfortunate reality is that some communities have more power than others in settling this question.
Barry: Government officials, police forces, affluent suburbanites: they generally win fights over how much they get to be surveilled. The real issue is probably how much the affluent will sacrifice the privacy of the poor to secure their own safety. And on the ground, these aren't just theoretical concerns.
Tape: [protest chanting] What do we say to PredPol? Shut it down! Hey hey, ho ho, CSOC has got to go.
Barry: Just hours after the police commissioners' meeting, the Stop LAPD Spying Coalition got together at their headquarters in Skid Row to organize a protest and occupation of the CSOC located in the Central Bureau.
Tape: We leave for the procession at 3:30.
Barry: Their goal was to have a protest group march to the Central Police Station, and to have a smaller group beforehand enter the station and demand to know if a CSOC was present, and if so, to have its commander publicly confirm or deny the many practices they have on record as part of LAPD predictive policing.
Tape: By the time you guys start marching, we're inside already shaking them up.
Music
Barry: They were planning to do this in just two days, and I followed them along, and followed them in.
Tape: We're gonna go inside.
Tape: It's about the dismantling of CSOC!
Tape: So you are denying that a CSOC exists?
Tape: We will use those statistics to determine where we need resources.
Tape: Shut it down!
Break: We will return to Hi-Phi Nation after these messages.
Barry: If you're a new or a long-time listener of Hi-Phi Nation, you'll know that we just joined Slate. I wanted to let you know that this season there's going to be bonus content exclusively for Slate Plus members. Slate Plus members get ad-free episodes of all Slate podcasts, including this one, and this season I'll be producing bonus segments, outtakes from the show, extended interviews, and original panel discussions with me and a guest on the philosophical issues from an episode. You can sign up for Slate Plus by clicking the link in the show notes: hiphiplus. Get two weeks free and thirty-five dollars for your first year. That's hiphiplus.
Pancake's Chant: [protest chanting]
Barry: The other predictive policing program the coalition is protesting is Operation LASER, LAPD's person-based predictive policing program. This particular program is far more secretive.
Tape: We want them to come out, answer questions the community has.
Barry: Jamie Garcia is one of the lead organizers of the Stop LAPD Spying Coalition. She's leading the team that is entering the Central Station to expose the CSOC, and to ask its commander whether there is a secret list of names rumored to be part of Operation LASER. This list, the Chronic Offender Bulletin, is supposed to be a daily list of people who are predicted by an algorithm to be that day's likely offenders.
Tape: I'm with the Stop LAPD Spying Coalition. I was told that there's a community safety operation center that's here. Is that true?
Tape: Not that I'm aware of.
Tape: So are you denying that a community safety operation center is here?
Tape: Not denying, just not aware of it.
Tape: CSOC is a racial program used to collect data on black and brown bodies. We want it shut down for the simple fact that it is contributing to the execution of our folks in our communities.
Barry: The captain in charge of the CSOC has decided to come down and is willing to talk to the community about it.
Tape: We had run an operation, which is merely a group of people who get together, look at where the crime is being committed, and we make determinations about where to best use police resources.
That is what you're referring to as the CSOC. That practice is not currently in use at the moment.
Tape: So are you telling us that there is no secret list of chronic offenders?
Tape: No, I don't have any list of chronic offenders in practice right now.
Tape: [chanting] Shut it down! Shut it down!
Barry: Tell me about Operation LASER.
Tape: Operation LASER is a person- and place-based predictive policing program.
Sarah: Which stands for Los Angeles Strategic Extraction and Restoration.
Barry: Sarah Brayne, sociologist at UT Austin. Sarah observed the practices of Operation LASER, and she considers the most important element of this predictive program to be not some fancy AI learning program, but a simple index card called an FI card.
Sarah: They're these really important data collection tools, and funny enough, they're used for all these other things too. They're just these small index cards, so they're used to, like, stuff a lock in a gate if you don't want it to lock behind you when you're going into a house, for example. Anyway, these FI cards are everything, and that's basically where you write down any information that comes up in the course of an interview with a civilian.
Jamie: Those cards are being used to map communities, to find out what kind of informal social networks exist in communities in order to disrupt them.
Barry: Jamie Garcia of the Stop LAPD Spying Coalition.
Jamie: In order to find out, through their own language, where goods and information get most exchanged. Finding those nexuses and disrupting them. Regardless of whether you are deemed a perpetrator or a victim, even being in the surrounding area, being a witness, opens you up to being mapped as well.
Barry: This isn't just suspicion or paranoia. The field interview cards contain information about who is in the car with someone during a stop, who might be across the street, the neighbor who walked by. People who aren't questioned by the LAPD but are observed to be in the vicinity can be put onto an FI card and then entered into the system. Car information, make, model, condition, it's all recorded too. And what's new is that all of this information is entered into the system daily, where officers can then run it through software called Palantir. Palantir can then give you a social network map for an individual: who they've been seen with in the past, cars they've driven, where in the neighborhood they've been stopped, and so forth.
Sarah: Even people that have never talked to the police and have no direct police contact are included in law enforcement databases now. You can't drive your car on a street in LA without being picked up eventually by an automatic license plate reader.
Barry: But it is what the LAPD is doing with all of this social network and mapping information that is most concerning to the Stop LAPD Spying Coalition. It's put into a computer algorithm to generate a daily list.
Jamie: The Chronic Offender Bulletin uses what's called a risk assessment. So you get five points if you've been arrested with a handgun, you get five points if you have a violent crime on your rap sheet, you get five points if you've been on parole or probation.
Sarah: And then five points for gang affiliation. And then people get one point for every police contact. So every time the police stop somebody and fill out an FI, or field interview card, on them, individuals get another point added to their score. And then, just like with the place-based predictive policing, where officers at the beginning of their shift are given these printouts, officers are similarly given these lists of what are called chronic offenders, people that have the highest point values in a division that day.
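The point system, as Jamie and Sarah lay it out, is simple enough to write down directly. Here's a minimal sketch; the field names are hypothetical stand-ins, since the LAPD's actual implementation isn't public.

def chronic_offender_score(person):
    # Point values as described above: five points each for a handgun arrest,
    # a violent crime on the rap sheet, parole/probation status, and gang
    # affiliation, plus one point per police contact (each FI card filed).
    score = 0
    if person.get("arrested_with_handgun"):
        score += 5
    if person.get("violent_crime_on_rap_sheet"):
        score += 5
    if person.get("parole_or_probation"):
        score += 5
    if person.get("gang_affiliation"):
        score += 5
    score += person.get("police_contacts", 0)  # one point per FI card
    return score

# The highest scorers in a division become that day's Chronic Offender
# Bulletin. Note the loop discussed below: a stop prompted by the list files
# another FI card, adding a point, which keeps the person on the list.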
Sarah: And you'll just sort of drive to a park, say, that's a known area for drug deals, and you'll say, oh hey, some of these guys are on our list, let's see if we can go talk to them, do a consensual stop, try and collect some intelligence on them through that, because-
Barry: Will they fill out a card?
Sarah: Yes, they would fill out a card for that.
Barry: So they would look for somebody who's on a list and then fill out a card after another interaction with them?
Sarah: Yes.
Barry: Okay. Does that mean they get another point?
Sarah: Yes.
Barry: Should I be worried about that? I mean, it's ringing bells.
Sarah: I mean, you can see how that might turn into somewhat of a self-fulfilling prophecy, for sure, or a feedback loop, if you will. Where if you're going out and specifically seeking out the people with high point values, and then you go and stop those people, and then that increases their point values, that can very quickly lead to a feedback loop.
Barry: It sounds like it's more than "can." I mean, it seems like it does, right?
Sarah: So I don't actually have the data on people's points, but I would definitely think that would be the case, yeah.
Jamie: We did get, through a public records request, one Chronic Offender Bulletin release, where we can actually see what it looks like. On that request, you can see a person being stopped about three or four times in one day. At that point, that person already has four points on them. We were able to access part of a list from MacArthur Park of people who were identified as chronic offenders, some as young as nineteen.
Music
Barry: Official LASER policy is that you get a point for a field interview only if it's a "quality" police interaction, but there's no definition of what that means. The E in LASER stands for Extraction, and that's the ultimate goal: getting the people on the chronic offender lists out of the neighborhood before they commit another crime. And there's another feature of LAPD data collection. All of the crime data allows you to identify addresses, businesses, and street corners that seem to have high-frequency interactions with police. They're called crime anchor points. Just like there's a goal of extracting a criminal with a high risk score from the area, there is also a goal of ridding an area of an anchor point, where the LAPD works with the City Attorney to evict or relocate people through the city-wide nuisance abatement program.
Jamie: I think what's insidious about this is, when you get an eviction notice, you cannot, it's impossible to rent again; you become stigmatized. And so, especially in a city where affordable housing is almost impossible to find, especially in a city where the homeless population is somewhere around fifty-five thousand people, they are now using this kind of pseudoscience to demonstrate that a property or people in a property should be removed. That's the component of it where it's like they're killing us softly now.
Music
Barry: The designer of Operation LASER turned down my request for an interview, but I think I can charitably reconstruct his reasoning. A very small number of people are disproportionately responsible for crime in a neighborhood.
When you strategically extract high-probability offenders in a laser-focused way, you're making neighborhoods safer with less collateral damage. And you're doing it in a race-neutral way, by calculating the probability of a future offense using only data that is correlated with future crime. If the consequence happens to be that young black and brown males are the only ones affected, it's because they're the ones satisfying all the race-neutral conditions of being likely offenders.
Jamie: When we look at all the different elements that are used to calculate the risk assessment of a potential chronic offender, we found in 2017 the black community was arrested five times more often than the white community. When you look at stops, which essentially lead to field interview cards being filled out, the black community was five times more likely to be stopped. Even in parole and probation, we're finding more and more studies identifying that the black community is more likely to be on parole or probation. So even though you claim race-neutrality, data can stand in as a proxy for race.
Barry: The other perspective has some merit too; many of the things that get you points in the algorithm are things under police control. If police want to give someone points, they just have to start talking to them, and then call it a quality interaction. Gang affiliation is another example. Who gets recorded as affiliated with a gang member? An officer could connect just about everyone to a gang member, if a neighborhood has gangs in it. When I was young, I lived next to gang members, rode the bus with them. I mean, every teacher in LAUSD probably teaches a gang member. So who gets the points for being affiliated with a gang member after an FI interview, the skinny Asian neighbor or the Latino neighbor?
And finally there's the issue of civil rights. How can the LAPD legally extract someone for merely having been predicted by an algorithm to be a future offender? Did you observe anything that looked like predictive arrests, a kind of intervention that seems like it would need some kind of probable cause or reasonable suspicion?
Sarah: Yeah. I didn't see them, you know, violating requirements for reasonable suspicion or probable cause, for example. It was really more of these consensual stops. It's not illegal for a cop to go up to anybody and start talking to them. You can say, "I don't want to talk to you" and walk away, but if you are a known gang affiliate and you're on parole or probation, you don't have the same ability to walk away. But then in the course of a consensual stop, you might see something that would constitute individualized suspicion, giving you reasonable suspicion to actually then question someone.
Barry: I guess another way of putting it, coming from the other side, is: why isn't a high score enough for individualized suspicion or probable cause?
Sarah: Yeah. I mean, I think that, just as much as it's a legal question, it's kind of a philosophical question: what is individualized suspicion? Predictive policing is really probabilities: this person has a higher probability of committing a crime in the future, based on past data, than this other person. Whereas individualized suspicion is not supposed to be probabilistic; it's supposed to be observable then and there. So just because somebody has done something in the past doesn't give you individualized suspicion now, in the present.
Sarah: But actually, I think that those ideal typical categories are getting conflated now in this world of predictive policing. What even is individualized suspicion? Is your same action interpreted as more suspicious if you're inside a predictive box, or if you have a high risk score, than if you don't? I think this binary is not really there anymore. It's eroding, at least.
Music
Barry: There's a thin line separating the legitimate and illegitimate use of police power. We give it names like individualized suspicion, reasonable suspicion, and probable cause. The biggest change from predictive policing, as I see it, isn't whether it's going to lead police to profile, identify, and target particular people and locations; that's just what they do. The most consequential question is whether it's going to lead to a revolutionary change in the standards of reasonable suspicion and probable cause, and whether that is going to be morally legitimate.
Sarah: You're not allowed to stop somebody just because they're black, right? But you are allowed to talk to somebody because they have a high point score. Now that it's quantified, it's more difficult to contest legally, or to really put your finger on where exactly the bias is.
Barry: Suppose you use gun and ammunition purchasing data and past history of domestic violence to determine likely mass shooters, and the algorithm spits out a few names. Is that sufficient suspicion for a stop, search, and surveillance? Or better yet, suppose the same algorithm used to determine who is likeliest to commit a crime is used to offer targeted job, educational, and social services. Can you make a claim of statistical bias when it's the same statistical algorithm doing the work?
Prior to the era of big data, the courts were very skeptical of allowing statistical evidence alone to satisfy legal standards of justification, something I'm going to cover in the next episode. But the Supreme Court has already signaled that it needs to revisit all of these questions in the era of big data.
Renee: I'm Renee Bollinger, I'm a postdoctoral researcher at the Australian National University.
Barry: Renee Bollinger is one philosopher among many who is thinking about the connection between statistical evidence and what people do with that evidence, particularly in matters of race and gender profiling. There's a seemingly unquestionable assumption that what statistics give us is what is likeliest to be true, and that the likelier something is statistically, the more justification we have for treating it as true. Bollinger argues that this isn't always the case.
Renee: What people don't tend to realize is that even for probabilities that clear our threshold for a reasonable belief, so it's 90 percent likely or 80 percent likely or something quite high, if the error costs are high enough, you shouldn't treat it as true. To do so would be to risk mistreating a particular person.
Barry: For example, if you teach a class where everyone gets a perfect score on their test, and you get evidence that 90% of the class cheated, you have to decide who to punish. Statistically, if you punish everyone, you'll be 90% accurate. But the error cost is that you've treated the 10% who didn't cheat unjustly, just by relying on this statistical evidence.
Renee: You'll be treating them in a way inappropriate to them. So that's the risk of a false positive. And then you weigh that against the risk of a false negative: what if it is true and you don't treat them as though it's true?
Barry: Imagine that you decide to treat one of your students as though they didn't cheat, but in fact they did. You'd let them get away with something.
Renee: The point that often goes unnoticed is that very often in these statistical inference cases, the costs of the false negatives, the failing to treat it as true even if it is, are not that high, while the costs of the false positives, treating the person as though this is true of them when it isn't, are actually quite high. And so you risk doing something seriously wrong to that person when it's actually false of them. So then, what that should do, I argue, is raise the evidential bar for how sure we have to be before it's okay to accept this proposition as true of that individual.
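One conventional way to formalize the trade-off Bollinger is describing, a textbook expected-cost rule rather than her own formulation, says to treat a claim as true of a person only when its probability exceeds C_fp / (C_fp + C_fn), the false-positive cost as a share of the total error costs. A sketch:

def treat_as_true_threshold(cost_false_positive, cost_false_negative):
    # Expected cost of treating the claim as true:   (1 - p) * C_fp
    # Expected cost of not treating it as true:       p * C_fn
    # Treating it as true is the lower-cost act only when
    # p > C_fp / (C_fp + C_fn), so high false-positive costs raise the bar.
    return cost_false_positive / (cost_false_positive + cost_false_negative)

# Cheating-class example: if punishing an innocent student is, say, nine
# times worse than letting a cheater slide, the bar is 9 / (9 + 1) = 0.9,
# so 90% statistical confidence sits exactly at the threshold, not above it.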
Barry: Bollinger's view is that even a statistically sound generalization, the kind you might get from a big-data analysis, doesn't automatically allow you to cross some threshold for treating something as true. Let's take reasonable suspicion, for example.
Music
One way to see how low the threshold for reasonable suspicion can be is to imagine ten people: nine are known to be innocent, and one is known to be guilty of something like shoplifting. But then the one person runs into a crowd of the other nine, and they all look similar enough that an officer can't distinguish them. Does an officer have enough grounds to search and frisk all ten to find the one shoplifter? If you think so, then 10% certainty is enough for reasonable suspicion. Now, if a predictive policing algorithm has a 10% threshold and puts some person's name on a list, there's a sense in which we've passed the usual threshold, and so officers should be allowed by law to stop, question, and search. Bollinger's view is that this isn't right, even if the statistical analysis is correct.
Renee: One of the big problems with using statistical evidence is that you end up increasing the chances that you'll get the wrong verdict for someone. So if you think, look, it's in the nature of the criminal justice system that sometimes we will falsely convict someone, sometimes we'll sentence someone to far longer than they're due in some sense, that's fine; but you should be worried when the errors start to concentrate on particular groups. And the trouble with statistical evidence is that it tends to reinforce these sorts of concentrations.
Music
Barry: One of the examples philosophers use to highlight this is an episode involving John Hope Franklin, the eminent African American historian. Franklin was at a fancy banquet in his honor in Washington, DC, where it turned out that all of the service staff were black, and the guests were mostly white. Now statistically, at this banquet, if you're black, you're likelier to be a service staff member than a guest. Now suppose a rich white lady decides that she needs to hand her coat to one of the staff, but she doesn't know what John Hope Franklin looks like. She is statistically in the clear if she just assumes that any black male she sees is a staff member. But she shouldn't do that. She is risking the mistreatment of John Hope Franklin, and in fact contributing to an ongoing mistreatment of him.
Because it turns out that making the statistical generalization will always mean that John Hope Franklin spends his life being assumed by others to be the help.
The analogy is that in the criminal justice system, even if the statistical evidence crosses the proper threshold, say 10% of the young black males in a location are responsible for all of the property crime, using that stat as sufficient grounds to stop and frisk young black males means you are subjecting all of the innocent young black males to ongoing mistreatment, and that's 90% of them. And this holds if you flip the numbers: if 90% are responsible for the crimes, accepting the generalization as true means you're subjecting all of the innocent young black males, the remaining 10%, to ongoing mistreatment.
Renee: And so then members of that group face a higher risk of suffering the false positive. And that's a fairness-based consideration against using this kind of evidence.
Barry: It almost sounds like the higher the statistical generalization, the less we should believe it of any individual. I mean, it's paradoxical, but it sounds that way. So if you're part of the 30% minority, and 70% there's something true about them, you're likelier to be treated as though you're in the 70 than you are if it was a 50/50 case. So it's like the injustice is likelier, which makes belief even harder to get to.
Renee: Yeah, yeah. So as we get these more imbalanced probabilities, one of the things that's tracking is that it's a more and more commonly held assumption about this group of individuals. So they are exposed to the risk of this error more and more often, and it's shaping more and more of their life. And so, as these disparities start to grow, that's a moral reason not to rely on them and not to use them as the basis for belief.
Barry: Wow. I mean, that's a very interesting and strangely paradoxical conclusion, right? The more likely you are statistically to have some feature, the less likely it should be for other people to believe that you have that feature.
Renee: Yeah, so long as the feature is one that is morally weighty. So if it's just a question about how likely you are, given that you're British, to like cricket, you know, maybe not a lot hangs on that, so it might be fine. But if it's how likely you are, given that you're black, to be a criminal, the higher those probabilities go up, the more moral reason we have not to base beliefs on them.
Barry: Bollinger and many others like her argue that it's possible for a statistical generalization to be both accurate and unjust. Sometimes, the more accurate it is, the more unjust it can be. Because the better your statistics, the better the chances that you'll treat the exceptions unjustly, and she's not willing to take that risk.
Music
Barry: Why is risk exposure itself bad?
Renee: People who have been exposed to risks have a lower well-being as a result of that directly. If you know that you've been exposed to a risk, then that has a lot of effects on your behavior. You can do things like avoid other sources of risk in your life, or do things that are likely to mitigate the harm that you've been exposed to.
Sarah: I've found that individuals who have been stopped by the police or arrested, and definitely those that have been convicted or incarcerated, are systematically avoiding institutions that collect data on them, surveilling institutions where the police might be able to access that information.
And these are really important institutions, like hospitals, banks, formal employment, and schools. And so if you have this whole swath of people who are avoiding these legitimate institutions out of concern about law enforcement surveillance, that's impeding their, you know, upward economic mobility, their social integration, all these kinds of things. And so I think that we really need to think about not just what the benefits of increased police efficacy can be, but what the chilling effects of surveillance can be too.
Barry: Next time, on Hi-Phi Nation.
Tape: One in five of all people in jail and prison in the United States are people that are detained before trial.
Tape: People plead guilty to get out of jail. And everybody knows this is how the system works. We pretend that it's about justice; it's not.
Barry: I follow predictive algorithms into the courtrooms, and even prisons, to see how they're being used to determine who to incarcerate.
Hi-Phi Nation is written, produced, and edited by Barry Lam, associate professor of philosophy at Vassar College. For Slate podcasts, editorial director is Gabriel Roth, senior managing producer is June Thomas, senior producer is T.J. Rafael. Production assistance this season provided by Jake Johnson and Noa Mendoza-Goot. Visit for complete show notes, soundtrack, and reading list for every episode.