Risa Goluboff, Danielle Citron and Anita Allen

S4 E13: Fighting Racial Discrimination in Our Digital Lives

University of Pennsylvania law professor Anita L. Allen discusses her framework for stopping surveillance, fraud and exclusion targeting Black Americans online.

Show Notes: Fighting Racial Discrimination in Our Digital Lives

Anita L. Allen

Anita L. Allen is the Henry R. Silverman Professor of Law and professor of philosophy at the University of Pennsylvania Carey Law School. A graduate of Harvard Law School with a Ph.D. in philosophy from the University of Michigan, Allen is internationally renowned as an expert on the philosophical dimensions of privacy and data protection law, ethics, bioethics, legal philosophy, women’s rights and diversity in higher education. She was Penn’s vice provost for faculty from 2013 to 2020 and chaired the Provost’s Arts Advisory Council.

Allen is an elected member of the National Academy of Medicine, the American Law Institute and the American Philosophical Society, and a fellow of the American Academy of Arts and Sciences. In 2018-19, she served as president of the Eastern Division of the American Philosophical Association. From 2010 to 2017, Allen served on President Barack Obama’s Presidential Commission for the Study of Bioethical Issues.

A prolific scholar, Allen has published over 120 articles and chapters, and her books include “Unpopular Privacy: What Must We Hide?”; “Privacy Law and Society”; “The New Ethics: A Guided Tour of the 21st Century Moral Landscape”; and “Uneasy Access: Privacy for Women in a Free Society.”

Allen served a two-year term as an associate of the Johns Hopkins Humanities Center from 2016 to 2018. She has been a visiting professor at Tel Aviv University, Waseda University, Villanova University, Harvard Law and Yale Law, and a Law and Public Affairs Fellow at Princeton. She visited the Government School at Oxford University in 2022, and will visit Fordham Law School in 2023 and Oxford’s University College as the Hart Fellow in 2024, when she will also deliver the H.L.A. Hart Memorial Lecture.

Allen currently serves on the boards of the National Constitution Center, the Future of Privacy Forum and the Electronic Privacy Information Center, whose Lifetime Achievement Award she has received and whose board she has chaired. Allen previously taught at Georgetown University Law Center and the University of Pittsburgh Law School, after practicing briefly at Cravath, Swaine & Moore and teaching philosophy at Carnegie Mellon University.

At Penn, Allen is a faculty affiliate of the Leonard Davis Institute of Health Economics; the Africana Studies Department; the Center for Ethics and the Rule of Law; the Center for Innovation, Technology and Competition; and the Warren Center for Network and Data Sciences.


Transcript

[THEME MUSIC IN, THEN UNDER]

Risa Goluboff: Today on Common Law, protecting Black Americans from online surveillance, exclusion and fraud with Anita Allen from the University of Pennsylvania Carey Law School.

Anita Allen: Sometimes what look like great solutions to privacy problems actually have non-race-neutral consequences.

[THEME MUSIC UP, THEN UNDER AND OUT]

Risa Goluboff: Welcome back to Common Law, a podcast of the University of Virginia School of Law. I'm Risa Goluboff, the dean. For this episode, we're welcoming back co-host Danielle Citron, a UVA law professor and director of the school's LawTech Center, which focuses on pressing issues in law and technology. Danielle received a MacArthur fellowship for her work as a pioneer in the field of intimate privacy. Her book “Hate Crimes in Cyberspace” is considered a landmark work, linking cyberstalking and civil rights. And her new book, out this summer, is "The Fight for Privacy: Protecting Dignity, Identity, and Love in the Digital Age." Danielle, it is so great to have you back.

Danielle Citron: Thank you so much. It's wonderful to be here and fire up the mic again.

Risa Goluboff: So who are we talking to today?

Danielle Citron: So today we're going to have the pleasure of interviewing Anita Allen, a law professor at the University of Pennsylvania Carey Law School, who's an expert on the philosophical dimensions of privacy and data protection law, ethics, bioethics, legal philosophy, women's rights and diversity in higher education. And among her many accomplishments, Professor Allen was the first female Black president of the American Philosophical Association's eastern division. And she just stepped down as the chair of the board of directors of the Electronic Privacy Information Center. And I'm taking over for her, so very big shoes to fill. Today we're going to be talking about her new paper, "Dismantling the Black Opticon: Privacy, Race, Equity, and Online Data Protection Reform."

Risa Goluboff: Sounds like a very fascinating and important discussion. We will be right back with Professor Anita Allen.

[THEME MUSIC UP, THEN UNDER AND OUT]

Danielle Citron: Anita, we’re so thrilled to have you with us. Thank you so much for coming on the show.

Anita Allen: Thanks so much.  

Risa Goluboff: We really appreciate your taking the time. And your new paper is so interesting and, you know, you're really defining a problem and a set of problems for a particularly vulnerable community, and then proposing a framework for how to think about solving that problem. So can you tell us more about why Black Americans are particularly at risk online?

Anita Allen: Let me start by saying that I have been a privacy scholar for over 30 years and always wrote about privacy from the point of view of everybody's privacy, sometimes women's privacy, but all women's privacy. So this is really the first time in my career that I focused on the experience of African Americans. I think because I am African American some people assume that I've always been a scholar of African American privacy, but it is not true. And so, I'm particularly excited about this different turn in direction for me. If I could just briefly tell you how I came to take that turn.

Risa Goluboff: Absolutely.

Anita Allen: The summer that George Floyd was murdered, I was approached by the International Association of Privacy Professionals to provide a, a little workshop for their board of directors on the topic ‘Privacy Through the Lens of Race.’ And I had never given a talk with that particular theme before. And so it was fascinating and a challenge and I realized how little prepared I was to give that talk. So, I pulled something together, but after that talk, I became obsessed, and I developed a course for Penn Law students called Privacy Law through the Lens of Race. And then I was approached by the Yale Law Journal to participate in this series of, of white papers on platform governance and equity issues, and I thought here's my moment to write about African Americans through the lens of race.

Danielle Citron: So could you talk about each piece of the Black opticon? You know, over-surveillance, exclusion, fraud, and maybe give us some examples so people can really wrap their heads around precisely the Black opticon.

Anita Allen: I coined the term “Black opticon” as a term that refers to the set of disadvantages that African Americans experience on platforms. And I came up with the idea of a Black opticon because I was having trouble myself summarizing for people the different ways in which Black people were facing a predicament of discrimination. And it consists of over-surveillance, of targeted exclusion from opportunities, and of targeted inclusion in scam jobs and frauds and deception and exploitation.

The first element of the Black opticon, over-surveillance, I also call just simply the panopticon. This goes back to Jeremy Bentham, to Foucault, the idea that some groups exercise power over others by observation. And it could be observation that is just simply aimed at control, or it could be specifically because you assume that people are engaged in deviant or criminal or wrongful behavior.

The second element of the Black opticon is the banopticon, borrowing a term from a European scholar named Didier Bigo. And the banopticon represents the, sort of the pushing of groups to the sidelines, placing people outside of the boundaries of civil society, in effect for a discriminatory exclusion, right? And when you start giving examples, people get it right away. Oh, you mean if someone's putting an ad on Facebook for housing, but they don't want me because I'm Black to answer that ad, so they try to run an algorithm that excludes me, where you're targeted for exclusion from an opportunity or a benefit that other people are welcome to have.

And then the third piece of the Black opticon is what I dub the conopticon – con jobs, scams, targeting people of color, including African Americans. So you know that African Americans are vulnerable for credit purposes, and so you target them with high interest payday loans, right? Or you believe that business opportunities are really attractive to people of color, and so you feed pyramid schemes to them that simply disadvantage them and cause them to buy a lot of stuff that then doesn't have any value to them.

Risa Goluboff: It's such a big problem, right? Because these pieces are so different.

Anita Allen: Right.

Risa Goluboff: It was striking to hear you talk about the origins of this piece, right? And I'm a civil rights scholar myself, right, so thinking about would this paper look different if it was a race scholar trying to understand how privacy operated, uh, that's, that's not really a question, but that's what – I'm thinking about that, right? But it does strike me that a big piece of your solutions is to say that our current way of approaching privacy problems can't fix these problems, right, because they're not sufficiently attentive to race, right?

Anita Allen: Exactly.

Risa Goluboff: You're bringing our attention to race. And then one piece of it is, well, race-neutral solutions aren't going to work. So can you talk to us a little bit more about that and why that's the case?

Anita Allen: The claim the paper makes – one of the claims – is that we need race-conscious approaches to attacking the Black opticon. And the reason for that is itself a little complicated, but one important reason is that sometimes what look like great solutions to privacy problems actually have non-race-neutral consequences. Let me give you an example in the offline world. There've been a couple of instances where physicians have targeted female patients for sex abuse. So Dr. Nassar at Michigan State, pretending to give medical exams, but actually engaging in sexual abuse. Or Dr. Nikita Levy at Johns Hopkins, who was pretending to give OB/GYN exams while surreptitiously filming his patients and then serving up the OB/GYN porn on the internet.

Risa Goluboff: Ugh.

Anita Allen: We have great privacy laws. You know, HIPAA plus state common law privacy rights, which make that kind of behavior illegal. But these gender-neutral laws, right, don't take into account the fact that women are much more vulnerable to this kind of behavior than men are. In order to attack that problem, I might recommend that we have specific laws that address the problem of OB/GYN sex abuse, right? OB/GYN privacy abuses. Because those kinds of laws or guidelines – rules – would actually directly hit the problem. So instead of saying, ‘okay, nobody can discriminate against anybody when it comes to advertising on Facebook,’ how about instead having a policy or a law that says, ‘Facebook cannot sell advertising space opportunities to companies that discriminate against African-Americans, Hispanics, elderly people, disabled people.’ But NAME the groups that are most vulnerable to the discrimination to make it absolutely clear that you can't just enforce the law when it comes to straight white men who are healthy, as opposed to Black women who are poor and disabled. So sometimes I think by naming the groups that are having the disproportionately bad experience with an area of life … naming it makes it easier to address it. 

Danielle Citron: Yeah, yeah, yeah.

Anita Allen: So my African American online equity agenda goals that I outline in the paper — one of them is that we want to reduce wrongdoing and disadvantage by recognizing that laws can be non-neutral in their impact, right? So one of the things that every lawmaker should have in mind as they're drafting new rules and new laws is, ‘Is this going to have a racially non-neutral impact?’ Also, you don't want to make things worse. If I enact this law, will it actually make things worse for African Americans? If I impose more restrictions on the sharing of data, could that make things worse for African Americans? You know, some of our colleagues, Danielle, Lior Strahilevitz has argued that, actually, less privacy is better for Black people than more privacy.

Danielle Citron: Oh yeah.

Anita Allen: I happen to disagree with that, but you get my point, which is that question needs to be asked whether or not the privacy solution is going to make things better or worse.

Danielle Citron: Yeah.

Anita Allen: And then my paper recommends as part of this African-American online equity agenda that we specifically look at the issue of over-surveillance, exclusion and fraud, deceit, and exploitation reduction, as among the things we think about as guidelines for shaping new laws and policies and rules.

Risa Goluboff: Would you put in the category of laws that are problematic the 2003 California voter initiative to prohibit the collection of personal data about race? You talk about that proposal in your wonderful book “Unpopular Privacy,” and I wonder if you could say a little bit more about that before we move on and talk about the new proposals in this paper.

Anita Allen: Absolutely. So the racial privacy initiative in California was something that I wrote about. It's actually one of the few examples in my own scholarship of when, long before George Floyd and the racial reconciliation, I was thinking about race and privacy. And I, I had adopted the view that I wrote about in my 2011 book “Unpopular Privacy,” that it's not clear that racial privacy is a good idea for racial minority groups in America. Maybe in Europe it's a good idea because of their particular history with the Holocaust, but in the United States, we have very little to gain if we cannot talk openly about people's races. Plus, it's, in a certain way, futile to try to conceal people's race because it's in our faces and in so many ways documented in our lives. So that initiative would have made it unlawful for public institutions to collect data about race. And it was the brainchild of, of Ward Connerly, who was an anti-affirmative action, African American conservative. And fortunately the initiative failed. Lots of people in legal education wrote pieces attacking the proposed law. Patricia Williams wrote an op-ed for “The Nation” attacking the law because it was obviously going to interfere with our ability to not only direct services to people who might need them most, say Chinese Americans or African Americans, or Mexican Americans in a particular community, but it would also just create this sense that somehow we need to stop talking about racial remedies and racial rights in connection with public life. So I was glad that didn't go through.

And I think that in some of the proposals that we're facing today for new state comprehensive privacy laws, some people are jumping on the, oh, we have to have a provision that makes it unlawful to collect data about race or to process data about race. And although I think the Black opticon proposal is that we be, you know, race-conscious in our remedies for privacy problems, I'm not sure that among the solutions or remedies should be a general proscription against collecting or processing racial data.

Danielle Citron: So we've got, you know, a consumer protection law in our own state that has that provision, right, to collect sensitive data, including race. Is Virginia making that same mistake? Are you worried?

Anita Allen: I am very worried that Virginia is making that same mistake.

Danielle Citron: Yeah.

Anita Allen: But I must say, I chose in my paper on the Black opticon, “Dismantling the Black Opticon,” to focus on Virginia because Virginia is a very interesting state. So as you know, by the end of 2021, Virginia was one of three states, including California and Colorado, that had enacted comprehensive state privacy laws.

Danielle Citron: Right.

Anita Allen: Your Virginia law doesn't go into effect fully until next January, but it's an amazing, avant-garde example of state privacy legislation. And I thought Virginia was a much more interesting state for me to look at than California or Colorado, because Virginia has a very large population of African Americans, I think more than 20%. It's also among the three, the only state that's a member of the former Confederacy. It has a history of Jim Crow laws throughout the state. And so it strikes me that what Virginia does to deal with privacy would certainly have a high likelihood of being, in a beneficial way, race-conscious and effective at addressing the Black opticon.

So I went to look at the Virginia statute with the hope that this one's going to be really great for dismantling the Black opticon. And what did I discover? That the state assemblyman who introduced the law is an African American, right? How great is that? So an African American introduced legislation. He had, um, two or three co-sponsors in the state who were people of color. I think there was a South Asian man, an African American person and a mixed-race person. So this was a largely, you know, people of color driven effort in Virginia to get a new state privacy law. And by gosh, they got that thing through the state legislature in record time. And you're on your way.

So I looked at the statute from the point of view of the Black opticon and its dismantling and my African American online equity agenda to see, well, what do I think about the law? And among my concerns was this business about how it treats race. And the way the statute treats race is it treats race as sensitive information, but it also gives people the ability to opt out of those protections. So on the one hand, you're saying, oh, there's something bad about collecting race data, but if people choose to have their race data collected, then that's a different story. As you know, Danielle, I am a big believer in legal paternalism when it comes to privacy, and I think even if people choose to give up some of their privacy, maybe they shouldn't. And if racial privacy is so important, why does Virginia let people waive that right, give it up?

And then there are other features of the law that are troubling. One of them being that the law does limit things like targeted advertising and, um, price differentials, but it also gives people the ability to opt out of some of the protections the statute offers. And my fear is that the reason why this is a “business-friendly” statute – your Virginia statute – is because it does give people the ability to opt out, which means it gives businesses the ability to induce people to opt out through, um, slick advertising, under-disclosure, lack of accountability, and ultimately, fooling people. (laughing) I would prefer to see a statute that was much more protective of consumers and gave consumers the real ability to have the kinds of protections that are going to be generally useful. Make fewer decisions on their own as to, well, should I choose that? Should I choose that? Should I opt out here? Should I opt in there? It provides the consumer in Virginia with a dizzying array of opportunities or obligations, right, of self-protection to opt in, opt out, et cetera. So I think that's a problematic feature of the law.

Risa Goluboff: Yeah.

Anita Allen: Another feature of the law which is concerning is that it doesn't provide for a private right of action. We privacy law experts know that many of our favorite privacy laws do not provide for a private right of action. HIPAA does not provide for a private right of action when it comes to medical information. FERPA, the education privacy law, doesn't allow for a private right of action. So you have to depend upon your state attorneys general or somebody in the, in Washington, D.C., at the Department of Health and Human Services or Department of Education to bring your claim for you. You complain to the agency and then they take over. But the state trial lawyers in Virginia complained that this provision in your state statute puts consumers at the mercy of the politics of your state attorney general's office. That if you have in there somebody who's not particularly friendly toward or sympathetic to the needs or the concerns of people of color or certain groups, they could choose not to use their enforcement powers against offenders on behalf of these vulnerable communities. Of course, the state trial lawyers have their own agenda, right, their own biases. They want those class-action dollars, those legal fees. But they raise a good point about the political climate of your state. Those are just some of my concerns about the statute.

Danielle Citron: You know, we, we get so caught up in this sort of – I always think of it as like procedural hogwash – you know, the, where are the defaults and are we opt-in or opt-out and do we have certain rights?

Anita Allen: Yeah.

Danielle Citron: If you could fix all of this for the Black opticon, would you just move away from procedural protections and towards more substantive ones? I love “Unpopular Privacy.” Um, I'm so convinced about your arguments about children's privacy and why, even if you don't have a taste for privacy, kids, too bad, so sad, was sort of your message, right, um, in “Unpopular Privacy.”

Anita Allen: (laughing) Yes.

Danielle Citron: So if you could have the, you know, Anita your wand is here, right. Let's fix all of this. Let's – even if it's unpopular, even if people don't want privacy. Like, are there a few things that you would say, you know what? We must do this, even if you don't have a taste for it.

Anita Allen: It does seem to me that some of the most fundamental and basic human rights should be inalienable. Selling yourself into slavery is not one of those rights which people should have, even though it could be seen as an exercise of freedom. Trafficking oneself as a sex slave should not be an option on the grounds of liberty or freedom of choice, right? And I think of privacy as more like those deep, fundamental human rights than like optional rights or optional liberties. The set of privacy protections that are in place in a country needs to be focused not so much on whether they maximize people's short-term interest in making choices, but rather on whether they maximize people's long-term and fundamental interest in having a life that has, uh, meaningful opportunities for privacy and private choice, and that protects them from exploitation and harm.

Risa Goluboff: Yeah.

Anita Allen: I do think that I would want to clean up the whole opt-in/opt-out myth. And instead of, uh, focusing on letting people choose whether or not to have core and basic protections, I would just give everybody those protections. And then if there were some emergency, you know, uh, like a global pandemic, for example, that required public authorities to interfere with, or take away some of those precious liberties on a short-term basis in order to save the life of the planet or save the community, that's a different conversation, right? But on a day-to-day basis, we shouldn't have to be worrying about whether or not every time we log onto a website, we are opting in and opting out in just the right, you know, proportion to protect our interest. It should be built into the system that those people who foster platforms or websites or business opportunities or services should be following some rules that are gonna keep our informational privacy, uh, protected at a, at a reasonable level that we can kind of agree on in a political way. So I think that's sort of how I would think about it.

And it won't be exactly the same for everybody. So maybe, you know, for some people, it would be a privacy harm to photograph them wearing a miniskirt. But for another person that would be trivial, nothing at all, but it'd be a big deal to photograph them smoking a joint in their backyard. So what it takes to respect a person's privacy won't be exactly the same in everybody's case, but I do think there are going to be some sort of fundamental privacies that, like freedom itself, need to be guarded by rules and principles and guidelines, including legal principles and guidelines. We're not even close to understanding exactly what that right mix is.

And I would love for us to have more conversations about what the best mix is as opposed to conversations about how much should the corporate sector be able to, you know, manipulate us, how much should the government be able to manipulate us, and how much should we be able to manipulate one another. Let's just talk about what kinds of core privacies constitute a good life and how can we make sure that everybody has an opportunity for those, subject to some variation based on personality and, and religion and taste and those kinds of things. So that's a kind of a long answer to your question, Danielle.

Danielle Citron: I'm so grateful you did it because I think now my dean understands how I think about privacy. You've shaped how I think about privacy as an inalienable right, something that we all deserve. Everyone who's listening should also read “Uneasy Access,” which is another foundational text that Anita's written. So that wasn't long, that was brilliant.

Anita Allen: One of my, uh, works in progress is a paper in which I try to retell the origin story of privacy from the point of view of African Americans. So, when you retell how we got privacy law from the point of view of African Americans, it doesn't start with Warren and Brandeis in 1890 writing an article for the Harvard Law Review. Instead, it starts with the Constitution itself and how it treated Black people and Native Americans, in terms of their not having the same common law and constitutional rights that we inherited from the English tradition and invented for ourselves as White peoples in the beginning of the country. And then it moves on to those State versus Mann and State versus Rhodes, those North Carolina cases where the court says, ‘because of privacy, we have to let the man beat his wife; because of privacy, we have to let the man go free who has unlawfully shot the enslaved person working for him.’ That privacy was used as a reason to not protect the interests of women and Black people. Warren and Brandeis don't talk about that at all. They talk about the ‘incivilities’ and the ‘effronteries of life’ and not about the ways in which privacy puts people's life and limbs at, at risk in American society.

Danielle Citron: As you might say, it's the wrong kind of privacy.

Anita Allen: Too much of the wrong kind of privacy, right.

[THEME MUSIC CREEPS IN]

Danielle Citron: Anita, it's always such a joy and pleasure to hear you speak and to talk to you about these issues. So thank you so much for coming.

Risa Goluboff: That was just fabulous.

Anita Allen: Thanks.

[THEME MUSIC IN, THEN UNDER, THEN OUT]

Risa Goluboff: So it seems to me that such an important part of the work that Anita Allen is doing is identifying some of those harms that are so hard to see, and that are harms that don't affect everyone the same way, right? These are harms with at least a disparate impact, if not a disparate intent, right?

Danielle Citron: That's right. And you've got privacy advocacy groups that are somewhat coming to this problem, that is, they see themselves as civil rights organizations, as well as civil liberties organizations. But what Anita is doing is demanding that they look at it at the fore. That is, groups like EPIC and the Future of Privacy Forum and EFF, they are increasingly understanding privacy as having a disproportionate impact on marginalized communities. We've seen that in the last 10 years. But what she's trying to do is almost have them make that the center stage of their work. And they are, at least EPIC is. I'm chair of the board. Anita was the former chair. This year's focus is on the Black Opticon and intimate privacy, thinking through the disparate impact that it has on women and minorities.

Risa Goluboff: When you say civil liberties, as opposed to civil rights …

Danielle Citron: Right.

Risa Goluboff: Could you just say for our listeners, why that distinction is so important and what it is that Anita Allen is doing to move the conversation from civil liberties to civil rights?

Danielle Citron: Civil liberties we might understand or think about as freedom FROM government, and civil rights as a freedom TO entitlements. That is to, to protections and to accommodations and to responsibilities. That everyone has equal opportunities. That, that we all are owed privacy and everyone enjoys it on equal terms. And so bringing the equality story to the fore, versus the freedom from – that everyone enjoys – FROM government surveillance. So of course, civil rights would involve anti-discrimination commitments, but it brings sort of the equality interests of minorities and women to a prominent place, as in a positive entitlement to not just freedom from, but freedom to opportunities.

Risa Goluboff: So it goes back to last year's season, which was all about equity.

Danielle Citron: Yeah.

Risa Goluboff: Ways of enabling people to achieve particular goals rather than merely formal opportunity.

Danielle Citron: Yeah, yeah. Anita has always talked about privacy understood the right way, recognizing that people are gonna use privacy as a weapon. And I've always wanted to say, ‘Forget privacy understood the right way.’ It's privacy as normatively appealing. That is, autonomy-enhancing, equality-reinforcing; it's love-securing. That is, privacy should be up for everybody, right? Rather than like privacy as understood the wrong way. Understood the wrong way – you know what that's called? It's called seclusion, secrets. It's hiding, you know, it's not about privacy. It's about something else. So I don't know. That's just a small thing. I always think, every time I see it, that privacy understood the wrong way is maybe not about privacy, right?

Risa Goluboff: Well, and Anita talks about, right, the ways in which private spaces historically were used to oppress.

Danielle Citron: Yes!

Risa Goluboff: In the household, what privacy meant was the man could do whatever he wanted.

Danielle Citron: Right.

Risa Goluboff: The slave owner could, could do whatever he wanted to enslaved people, because that was his prerogative over that private space.

Danielle Citron: She talks about that very thing in her first book “Uneasy Access.” But what I have tried to add to it is that we've got the privacy conflicts. We haven't seen them fully there. That is, we see the male's privacy interest in closing the windows of the home from inspection by law enforcement. What we fail to see is that the woman has been deprived of her intimate privacy the whole time. That is, she's naked, demanded to be, you know …

Risa Goluboff: Sexually available.

Danielle Citron: Even though you would say, ‘Oh, that's her domain, the home,’ but no, she's had no privacy there, ever. She's under constant inspection. She's supposed to perform. She has no entitlement to intimate privacy in the home, at least in that 20th-century way.

Risa Goluboff: Historians talk about ‘separate spheres’ in the 19th century for women. And so what you are saying is, ‘No, no, that was not …’

Danielle Citron: They’ve never enjoyed privacy!

Risa Goluboff: Right. Right. So when you talk about defining privacy this way, does that mean in, in a sense that the civil rights conception of privacy has to be, is integral to …

Danielle Citron: Yup.

Risa Goluboff:… the whole notion of privacy, that even dividing them into a civil liberties and a civil rights conception is a false dichotomy from the start?

Danielle Citron: Yes. I think that's right. I, I would bake in equality as inherently part of the privacy story. So let's say you would say, ‘Danielle, okay, in a world in which we've solved every equality problem, there is no discrimination. We are free from bigotry and invidious attitudes. You know, do we still need privacy?’ And I would say, I love, I love that idea. Let's live in that world. Um, but even if we could free ourselves of ALL equality concerns, that everyone got privacy on equal terms no matter what, right, no matter your background, I think we still need privacy for other reasons. Autonomy, dignity, intimacy, love. Unfortunately, we don't live in a world without those invidious attitudes. So I don't wanna say you can never disentangle them, but until we can get rid of invidious attitudes … You know, we often think, ‘Oh, misogyny is so bad in the U.S.’ – but it's bad everywhere, unfortunately, I've seen in my own work. Until we can get to a place where we're free of all invidious attitudes, equality and privacy go together.

Risa Goluboff: Would you go a step further? I mean, the historian in me thinks not only about invidious attitudes, but historical discrimination, institutional racism, misogyny. And it's not only, I think, about attitudes, it's about where those attitudes now live …

Danielle Citron: Yeah.

Risa Goluboff:… in institutions that then perpetuate themselves, right?

Danielle Citron: Yes. And those become almost the firmament, right? That even if you could get beyond the specific attitudes, right, and stereotypes, let's vanquish them, you're still gonna have it built into institutions and the way in which power is then diffused, right, in society. I don't think we'll ever move beyond it in that sense, right, because it's baked into architectures of systems and tools and services and it is baked into law. Uh, so we're gonna have to – I think that commitment to privacy must be a commitment to equality too.

Risa Goluboff: Well, I hate to say goodbye, Danielle, but, uh, but this has been such a fascinating conversation and such an important one, and it has been such a pleasure to co-host with you on these episodes.

Danielle Citron: Thank you so much for allowing me to come and co-host and bring some of my favorite people in the world, um, to our podcast. So thank you so much. What fun.

[THEME MUSIC IN, THEN UNDER]

Danielle Citron: That does it for this episode of Common Law. For more information on Anita Allen's work on the Black Opticon, visit our website at Common Law Podcast dot com. There you'll find all of our previous episodes, links to our Twitter feed and more.

Risa Goluboff: In two weeks, co-host Cathy Hwang returns as we talk “odious debt” with UVA Law's Mitu Gulati.

Mitu Gulati: When you make people who got their independence after fighting a bloody war pay for their freedom, I don't think you need to be all that scholarly to think that this is pretty stinky.

Risa Goluboff: We can't wait to share that with you. I'm Risa Goluboff.

Danielle Citron: And I'm Danielle Citron. Thanks for listening.

[THEME MUSIC UP, THEN UNDER]

Emily Richardson-Lorente: Do you enjoy Common Law? If so, please leave us a review on Apple Podcasts, Stitcher, or wherever you listen to the show. That helps other listeners find us. Common Law is a production of the University of Virginia School of Law and is produced by Emily Richardson-Lorente and Mary Wood.

[THEME MUSIC UP, THEN OUT]