Matthew Tokson: We think this is a social norm, and therefore we’re going to set Fourth Amendment law based on that. And I think that can have a lot of problems, problems related to gender or race or class.
Diane Maggipinto: This is 3 in 5. I’m Diane Maggipinto, the host of this podcast. Three questions in five minutes from the S.J. Quinney College of Law at the University of Utah.
My guest today,
Matthew Tokson: My name’s Matthew Tokson. I’m a professor of law at the University of Utah S.J. Quinney College of Law.
Diane Maggipinto: Your article, Social Norms in Fourth Amendment Law, was written with law and computer science professor Ari Ezra Waldman. Tell us more about it. What are the main takeaways?
Matthew Tokson: Courts often rely on social norms and practices in Fourth Amendment law.
They often look to something external to the law to determine, for instance, what is a reasonable expectation of privacy under the Fourth Amendment. That, in turn, dictates what counts as a search under the Fourth Amendment. Our critique of this practice is twofold. The first is that courts often consider social norms to be closed.
We call this the closure principle: society goes through a period of struggle, figures out what its social norms are, and then that’s it for the foreseeable future. We just don’t think that’s an accurate portrayal. Oftentimes older norms can embed discriminatory concepts and practices, and just outmoded thinking.
Let me give you an example of how courts use social norms in shaping Fourth Amendment law. A classic case where this happens is Georgia v. Randolph. In that case, Justice Souter, a justice I was hired by at the Supreme Court and whom I love dearly, wrote an opinion that basically said: there is a social norm that if you’re at someone’s doorstep and there are two people there, one of whom wants you to come into the house and one of whom wants you to stay out, politeness dictates that you remain outside. You wouldn’t feel welcome going in.
I’m not saying that he necessarily got that norm assessment wrong, but you can imagine some issues with basing Fourth Amendment law on that social norm assessment. Suppose a wife wants the police to come in and intervene in a domestic dispute, the husband does not, and it’s not an obvious emergency situation in which the police could enter anyway. Then the police aren’t necessarily going to go in, and they might not be able to go in to prevent a later altercation. So it can have a gendered effect that I don’t think the Court really recognized or grappled with in that case.
So that’s a classic example of the court very overtly relying on a norm assessment: we think this is a social norm, and therefore we’re going to set Fourth Amendment law based on it. And I think that can have a lot of problems, problems related to gender or race or class. Again, I’m not saying that’s an incorrect assessment of the norms, but I do think it’s a problematic way to set Fourth Amendment law.
Another problem with relying on social norms is what we call the non-intervention principle. This is when courts wait for social norms to develop, saying, well, we don’t really know what the norms are around text messages or web surfing or whatever new surveillance technology. That can be problematic in a couple of ways.
By the time norms begin to emerge at all, or become at all clear, you’ve given the government years or decades to surveil people without any constitutional check. The other issue is that when you leave it to the private sector or non-governmental actors to develop norms, you’re giving a lot of power to entities like tech companies and other actors that have a vested interest in having less privacy and shrinking privacy norms. Those are the folks who are going to be shaping norms and practices in society.
Diane Maggipinto: And yet, in your paper, you say that courts have just sort of declined to review a lot of cases that have to do with new technology.
So what’s going on there?
Matthew Tokson: When courts are confronted with these new technological questions, I think they feel very comfortable punting, deciding to pick the issue up later once they fully grasp it. That just doesn’t work if you want robust regulation of government surveillance.
If you’re going to robustly regulate new surveillance technologies, which are often very invasive and gather a whole bunch of private information about people, you’re going to have to act fast.
Diane Maggipinto: What do you propose? What are the solutions in your paper?
Matthew Tokson: One of the things we suggest is that courts might apply a different form of stare decisis in cases involving new surveillance technologies.
Where we’re asking courts to intervene so quickly, we recognize that they may make errors in doing so. Indeed, that’s why they’re so reluctant to get involved: they don’t want to mess things up. A different conception of stare decisis in cases like this might make courts a little more comfortable with intervening early, because intervening early is so necessary, but it is difficult to get this stuff right.
These are complex questions and complex technologies, so courts should permit themselves to revisit their decisions, especially in this area, a little more than they have. I also think the Court might respond to the idea of being a bit more forgiving of itself when it does intervene quickly. If the entire legal profession, and black letter law itself, says, look, these are difficult cases, but they require early intervention, and we’re going to forgive the court if it messes up, then we embed a certain compassion and forgiveness for the judges. Time will tell.
Diane Maggipinto: That’s 3 in 5 from the University of Utah S.J. Quinney College of Law.