Transcript of "A New Jim Code" featuring Ruha Benjamin and Jasmine McNealy – September 24, 2019

Welcome to the September 24 edition of the BKC lunch. Thank you for coming. Just so you know, before we get to the good part, some housekeeping. First of all, this is being live streamed and recorded. So wave to everybody on Twitter and elsewhere. I tell you that so you know that you are being recorded and can govern yourselves accordingly. If you want to, please feel free to tweet questions at us at BKCHarvard or using the hashtag BKCLunch. The Q&A session will be at the end, after Dr. Benjamin talks.

So now the good stuff, right? Dr. Ruha Benjamin is a scholar of the social dimensions of science, technology, and medicine and associate professor of African American Studies at Princeton University. She is the founder of the Just Data Lab, which focuses on bringing a humanistic approach to data and reimagining and rethinking data for justice. She is the author of two books: People's Science, out from Stanford University Press, as well as the new book, Race After Technology, out from Polity, which was published this year. She is also the editor of Captivating Technology, which is out from Duke University Press this year and is available after the talk outside. So please join me in welcoming Dr. Ruha Benjamin.

[APPLAUSE]

Good afternoon. How are you? I'm thrilled to be here. It's my first time visiting the center. And I'm so excited to be in conversation with my sister colleague Jasmine McNealy, who actually read early drafts of Race After Technology. So I was able to incorporate her and others' insights into that work. So we have a little bit of time, and I know most of it is going to be left for discussion. So I'm just going to jump right in.
Please join me in acknowledging that the land on which we gather is the traditional and unceded territory of the Massachusett. We acknowledge that academic institutions, indeed, the nation state itself, were founded upon and continue to enact exclusions and erasures of indigenous peoples. This acknowledgment demonstrates a commitment to beginning the process of dismantling ongoing legacies of settler colonialism and to recognizing the hundreds of indigenous nations who continue to resist, live, and uphold their sacred relations across their lands.

With that, let me begin with three provocations.

First, racism is productive. Not in the sense of being good, but in the literal capacity of racism to produce things of value to some even as it wreaks havoc on others. We are taught to think of racism as an aberration, a glitch, an accident, an isolated incident, a bad apple, in the backwoods, and outdated. Rather than innovative, systemic, diffuse. An attached incident. The entire orchard. In the ivory tower. Forward-looking. Productive. In sociology, we like to say race is socially constructed. But we often fail to state the corollary that racism constructs.

Secondly, I'd like us to think about the way that race and technology shape one another. More and more, people are accustomed to thinking about the ethical and social impact of technologies. But this is only half of the story. Social norms, values, and structures all exist prior to any given tech development. So it's not simply the impact of technology but the social inputs that make some inventions appear inevitable and desirable.

Which leads to a third provocation: that imagination is a contested field of action, not an ephemeral afterthought that we have the luxury to dismiss or romanticize, but a resource, a battleground, an input and output of tech and social order. In fact, we should acknowledge that most people are forced to live inside someone else's imagination.
And one of the things we have to come to grips with is how the nightmares that many people are forced to endure are the underside of elite fantasies about efficiency, profit, and control. Racism, among other axes of domination, helps produce this fragmented imagination. Misery for some and monopoly for others. This means that for those who want to construct a different social reality, one grounded in justice and joy, we can't only critique the underside, but we also have to wrestle with the deep investments-- the desire, even-- for social domination. So those are the main takeaways.

Let's start with some concrete examples. A relatively new app called Citizen will send you real-time crime alerts based on a curated selection of 911 calls. It also offers a way for users to report, livestream, and comment on purported crimes via the app. And it shows you incidents as red dots on a map so you can avoid supposedly dangerous neighborhoods. Now many of you are probably thinking, what could possibly go wrong in the age of Barbecue Beckys calling the police on black people cooking, walking, breathing out of place? It turns out that even a Stanford-educated environmental scientist, living in the Bay Area no less, is an ambassador of the carceral state, calling the police on a cookout at Lake Merritt.

It's worth noting also that the app Citizen was originally called the less chill name Vigilante. And in its rebranding, it also moved away from encouraging people to stop crime; now it simply encourages them to avoid it. What's most important to our discussion, I think, is that Citizen and other tech fixes for social problems are not simply about technology's impact on society but also about how social norms and values shape what tools are imagined as necessary in the first place. So how should we understand the duplicity of tech fixes, purported solutions that nevertheless reinforce and even deepen existing hierarchies?
In terms of popular discourse, what got me was when Microsoft employees opposed the company's ICE contracts, saying, quote, "as the people who build the technologies that Microsoft profits from, we refuse to be complicit." This kind of informed refusal is certainly necessary as we build a movement to counter the New Jim Code. But we can't wait for workers' sympathies to sway the industry.

Initiatives like Data for Black Lives and the Detroit Community Tech Project offer a more far-reaching approach. The former brings together people working across a number of agencies and organizations in a proactive approach to tech justice, especially at the policy level. And the latter develops and uses tech rooted in community needs, offering support to grassroots networks and doing data justice research, including hosting DiscoTechs-- which stands for Discovering Technology-- multimedia mobile neighborhood workshop fairs that can be adapted in other locales.

I'll quickly mention one of the concrete collaborations that's grown out of Data for Black Lives. A few years ago, several government agencies in St. Paul, Minnesota, including the police department and the St. Paul Public School system, formed a controversial joint powers agreement called The Innovation Project, giving these agencies broad discretion to collect and share data on young people with the goal of developing predictive tools to identify, quote, "at-risk youth" in the city. There was immediate and broad-based backlash from the community, with the support of the Data for Black Lives network. And in 2017, a group of over 20 local organizations formed what they called the Stop the Cradle to Prison Algorithm Coalition. Eventually, the city of St. Paul dissolved the agreement in favor of a more community-based approach, which was a huge victory for the activists and community members who had been fighting these policies for over a year.
Another abolitionist approach to the New Jim Code that I'd like to mention is the Our Data Bodies Digital Defense Playbook, which you can download for free online-- and make a donation to the organization if you're so inclined. The playbook contains in-depth guidelines for facilitating workshops and group activities, plus tools, tip sheets, reflection pieces, and rich stories crafted from in-depth interviews with communities in Charlotte, Detroit, and LA that are dealing with pervasive and punitive data collection and data-driven systems. And the aim here, as the organization says, is to engender power, not paranoia, when it comes to technology. And although the playbook presents some of the strategies people are using, in the spirit of Douglass' admonition about the upperground railroad, not everything that the team knows is exposed. Detroit-based digital activist Tawana Petty put it bluntly: "Let me be real. Y'all are getting the Digital Defense Playbook, but we didn't tell you all their strategies and we never will, because we want our community members to continue to survive and to thrive. And so the stuff that's keeping them alive we're keeping to ourselves."

And finally, close to home, the work of my brilliant colleague at MIT, Sasha Costanza-Chock, and the Design Justice Network. Among the guiding principles of this approach is that we prioritize design's impact on the community over the intentions of the designer, and before seeking new design solutions, we look for what is already working at the community level.

The fact is, data disenfranchisement and domination have always been met with resistance and reimagining, in which activists, scholars, and artists have sharpened abolitionist tools that employ data for liberation. From Du Bois's data visualizations that sought to counter the racist science of his day to Ida B. Wells-Barnett's expert deployment of statistics in The Red Record, there is a long tradition of challenging and employing data for justice.
In that spirit, the late legal and critical race scholar, Harvard professor Derrick A. Bell, encouraged a radical assessment of reality through creative methods and racial reversals, insisting that to see things as they really are, you must imagine them for what they might be. And so one of my favorite examples of what we might call a Bellian racial reversal is a parody project that begins by subverting the anti-black logics embedded in new high-tech approaches to crime prevention. Instead of using predictive policing techniques to forecast street crime, the white collar early warning system flips the script by creating a heat map that flags city blocks where financial crimes are likely to occur. This system not only brings the hidden but no less deadly crimes of capitalism into view but includes an app that alerts users when they enter high-risk areas, to encourage citizen policing and awareness. Taking it one step further, the development team is working on a facial recognition program to flag individuals who are likely perpetrators, and the training set used to design the algorithm includes the profile photos of 7,000 corporate executives downloaded from LinkedIn. Not surprisingly, the average face of a criminal is white and male.

To be sure, creative exercises like this are only comical if we ignore that all of their features are drawn directly from actually existing proposals and practices in the real world, including the use of facial images to predict criminality. And so, less fictional and more in terms of getting involved, I would encourage those who are interested to sign up for this webinar, which we might think of as a movement to track the trackers. Going back to my collaborators in St. Paul, they're trying to build up a national network of people who want to be more involved in this. And I'll tweet this out later for those who don't have a chance to snap it.
So if, as I suggested at the start, the carceral imagination captures and contains, then an abolitionist imagination opens up possibilities and pathways, creates new templates, and builds on critical intellectual traditions that have continually developed insights and strategies grounded in justice. May we all find ways to build on this tradition. Thank you for your attention.

[APPLAUSE]

So obviously, like I said, this is the good stuff right here. Just as moderators [INAUDIBLE]. So, new book: Race After Technology. And you mentioned some of the chapters. But could you talk a bit more about the ideas, the concepts that you cover in each of the chapters and the distinctions between them?

Yeah. And so part of what's happening with this project is my sociology side and my black studies side are kind of trying to tango and trying to get along. So starting with the black studies, the critical race approach, I just went into it trying to show the productivity of racism and finding racism everywhere. And one of my early readers, one of my colleagues, was like, hold up, Ruha. Hold up. You can't just say everything is racism. You need to make some distinctions, create some categories. Put things in a little box the way sociologists like. And so my next step was trying to think about what are some of the ways to differentiate what I was seeing-- the way that coded bias was operating in different systems. And so that led to the chapter breakdowns. And the way you can think about it, from Engineered Inequity to Techno-benevolence, is going from more obvious forms of the New Jim Code-- things that you kind of see coming, where the designers are trying to create hierarchies and inequality-- down the line to the point where people are trying to address bias through technology and create various tech fixes, the promise being that this will allow us to sort of bypass these human subjectivities and biases.
So going through the chapters is going from more obvious to more insidious forms of coded inequity, where, beyond the intentions of the designers, you can still reproduce inequity. So it's trying to disentangle the intention of the designer to do good. And for me, that's the more interesting bit: where the ethos is to solve social problems through technology, yet unwittingly one can still manage to reproduce these biases and inequities. And so that's the breakdown that the chapters and those categories follow. And importantly, it's not to draw bright lines between these different lenses but to sketch a kind of spectrum and to see the way that, beyond the intentions, these effects can take place.

Part of the conversation in the book is about not just whether certain technologies should be built, but how we frame them, how we talk about them. You mentioned Vigilante was rebranded as something else. So can you talk about how the language we use when we talk about technology frames adoption?

Yeah, absolutely. I think one of the most surprising things for me, from when I started the project to the present, is how the growing public consciousness around this and a kind of deep skepticism towards technology have grown more palpable. I thought when I started this I would have to be more on the defensive in these conversations. [...] their voice within academia. I think about medical schools. And I've been brought in to medical schools to think about how race is incorporated in their pedagogy. And students-- White Coats for Black Lives and other organizations-- basically saying, you know, we don't feel like we're equipped to go out into the world to be doctors if we don't have x, y, and z set of skills with respect to race and inequity and so on. So it's almost the beneficiaries of this education saying, you're not training us up right. And it's an interesting reversal in terms of who's taking the lead.
But I do think that, like White Coats for Black Lives, it would be heartening to see engineering students and other students in STEM fields understand that this goes beyond their own universities, to think about it nationally, in the broader sense of building a movement of students who are calling into question these fields and their training within them.

And I also welcome-- I know people often say don't make a comment, ask a question. But I also think this is a discussion. So you should feel free to just reflect out loud and give ideas. Correct me if I said something wrong, or elaborate. You don't have to only ask a pointed question.

Hi, Dr. Benjamin. So my question is around what you want this book to do in the world. Because with the New Jim Code-- one of the things I think about in Michelle Alexander's work is that she clearly set out to start a movement. Is your use of that term-- is it that you're looking to do something similar? And then a follow-up and related question is, who's reading this book and what do you hope for them to do?

Yeah, those are both great questions. And I see the book and myself as part of an existing movement of people who are calling into question techno-utopianism, as it were. And so I don't see it as kicking off anything. But I do see it as more of a provocation than an end statement, a conclusion of this is what we do, our marching orders. It's more to sort of provoke. And the main readers I had in mind when I first started were my own students, because I had students from across the university coming from humanities, social sciences, black studies, along with my STEM students, my engineering students. And so part of it is to stage a conversation, to show that sort of meeting of the minds. Your interests should converge. And so that's what I'm trying to do in the book: to bring together these fields, and also my students, in conversation with people who care about these different things but may not necessarily be talking to one another.
And so it is a provocation to jump-start conversations so that people have to talk to each other. And then the questions that often come up are similar to how we started, which is students coming out of engineering, computer science, who say, how can I contribute? What should I do differently? What else should I be thinking? And so I'm thinking about how to respond to those kinds of questions in a productive way. And then your second part was?

The second part was what should we-- so much of my work is in translation.

Yes.

To do something about these issues.

Yeah. I read it earlier.

Yes. [INTERPOSING VOICES] You read a rough, rough draft. [LAUGHTER]

So to answer that, for me, it's really about these different spheres of influence and activity. I don't want to create a sense of, if you're not doing x, y, and z, you're not contributing. It's really: think about what your sphere of influence and activity is and how you might raise it. Let's say you're working in the private sector. I talk a lot with K through 12 educators. And so part of it for me is to seed these ideas before the students get to me. I think it's relevant to how we teach in secondary education. The other audience is the tech insiders who already care about this. And the way that I think about this growing movement, like tech mobility-- these are people who took a good sociology class when they were in college. And they're, roar! [LAUGHTER] They're over there at Google making trouble. It's like, yeah. Seeding that, you know? But then the thing is, I want the book and my own position in the academy to lend legitimacy to their concerns, so that they can be like, well, if you don't believe me, check out this [INAUDIBLE] book. And so part of it is the way that the book can be used to bolster what people care about and were already trying to do. The book itself becomes a tool to contribute to raising that kind-- and that's been happening. Those kinds of connections have been happening.

Hi.

Hi.
Thank you for being here. I'm having a little fan moment because I follow so much of your work and it always makes me feel so seen.

Aw. What's your name?

Kathy Gale. Nice to see you.

Oh, I know you. Hi. [LAUGHTER] On the internet.

And so one of the things I do-- this is a comment and then a question-- is help lead and run an organization that's currently funding CS programs to integrate ethics into the curriculum. Some of the folks in this room are from universities who are working on that as well. And as you know, interdisciplinary work can be quite difficult, and some of the approaches to ethics can be: pair philosophy with computer science and we're done. And someone laughed, but it's definitely-- [INAUDIBLE] is taking off in lots of different parts, including companies building practices out of philosophers coming to tech companies to do that as well. Because oftentimes tech will lump anything that is not engineering in as a whole discipline. And so whether it's a philosopher or a race and gender scholar or a--

You all look alike.

That it's all the same. And if we pick any one of them, we will build more ethical technology. So that was the comment. The question is: you've been so deeply in this field. What are some of the ways you've been successful in getting into medicine or getting into different groups to really get folks to take this work and figure out how to bring it into their fields? Not just a, I heard it at the seminar. Cool, I believe you, but back to my day job. How have you seen effective ways to really get people to take this and really internalize it and use it?

No, that's a really important question. But in some ways, I have the easiest job, because the real work is done by those who are embedded in an institution and then take the ideas and try to institutionalize them or try to change things.
And so as a provocateur coming in and out of spaces, talking with students at a medical school or talking with technologists in a company, my job is the easiest, I think. The hard work belongs to those who have to then grapple with the politics of the place, the sort of intransigence, the way that people often give lip service to great ideas but balk when it comes to implementing various things.

Hi. Jessica Field. I'm the director of the Cyberlaw Clinic here at Berkman Klein, and we do work on the domestic dimensions of tech and exclusion, on civil rights and issues like that. But we also do work on the international dimensions of it. And I was wondering if you could talk a little bit about how you see this working out with so many US-headquartered tech companies that are focusing on, perhaps, US dimensions of the problem, and on how tech and exclusion play out when those policies are applied on the global stage, whether with respect to human rights-- I hear about your work in tech and human rights-- or more generally?

Yeah, that's a great question. And that's really an opening to encourage, I think, more work for people to think about how-- so this idea of the New Jim Code is evoking a US history of white supremacy, by evoking The New Jim Crow and Jim Crow. So it's really situated in thinking about how things play out here. So the challenge is to think about what the socially salient hierarchies are in any region or any country, and then to ask similar questions of power and inequity and how they collude with technology in a particular place, to see how it plays out. And so there are things in common, I'm sure we'll find. I have a few examples in the book drawing from India's national ID system and how that creates all kinds of new caste exclusions-- people who are left out. That biometric ID system is the gateway to accessing almost every public benefit, and if you don't have one, that can be used to create various exclusions.
Looking at the way that Muslim populations are treated in China-- so I have these very short vignettes that are kind of teasers, really, to say this is not just a US problem. But how it's playing out and how we need to study it, I think, need to be situated. So I don't offer this as a kind of universal theory to explain everything everywhere. I think that that's actually counterproductive. And it's indicative of a particular way of thinking about knowledge: unless it's a grand theory, it's not very useful. And so I am really encouraging students and other researchers to take the core questions-- moving beyond how to create ethical tech to thinking about power and technology and how these are co-produced-- and ask this of various places. And I'm so happy to hear that's part of the mandate of the unit that you work in.

Hi. My name is Zarina Moustafa.

Hi.

I'm a [INAUDIBLE] here. And I used to work on the Hill. And I was very interested in criminal justice reform. And one of the ways I started looking at the intersection of tech and justice was with the First Step Act and how they had the Risk Needs Assessment Tool in that act. And I was like, this is problematic. And so I guess I'm concerned about, or I'm curious about, what you would say is one of the most dire areas to start in. And I know you said people should operate in their spheres-- if they're in the private sector, they should do work there. But what do you think is really urgent? And also, where do you think we have to give? Like you said, in airports, they're going to start using biometric--

Oh, they have. They have. I just had my face scanned last week, yeah.

So should we oppose everything? And I guess, where do we give, and what are the areas that we need to focus on?

Those are two really hard and important questions. The second one, I would say especially, is a question that shouldn't be answered by one individual. It's one of those questions that has to be part of-- that's part of the struggle.
Part of the deliberation, part of the movement-building, is to think about how we create that prioritization together, rather than me giving a sort of marching orders. But for the first question, I'm really interested in those sites that are offering alternatives. Like the tech-for-good ethos, and people producing products and interventions that are various tech fixes for a problem-- whether it's the problem of public safety or, for example, the problem of fair hiring. So there are all of these new software programs being rolled out in hundreds and hundreds of companies and organizations to make the process of hiring people more efficient. But also, the promise is that it's more fair, because it's not a human interviewer sitting there judging you, but a data-driven system that's collecting thousands of data points based on the way that you move your lip or your eye, or you look down or up, and your posture. And so the idea is that more data will lead to more fair outcomes. And so it's these areas where the promise is so high that not only is it going to do what technology is meant to do-- make things easier, faster, more efficient-- but it has a social mandate as well, to bypass [INAUDIBLE] or create inequity. And so I'm not saying that I want everyone to rush to study these tech-for-good things, but that's what draws my interest, precisely because of the promise. But I don't see it as divorced from the more obvious harms. And so we do need people thinking about how tech is being rolled out in the context of the carceral state. There's a whole new wave of initiatives to incorporate e-carceration. Techno-corrections. Ankle monitors. Tracking. And there is, for those who are interested, a coalition of people who are working on this and working on legislation around it as well-- not just sort of academic critiques of it. And so I want us to spread out.
And some of the more obvious things that are coming at us are essentially wrapping these shackles around youths, and tracking them, and calling them, and finding ways to create technical violations to lock people back up. Across the spectrum, all of the things coming down the pike that are going to save us from ourselves. I think we need attention-- and not strictly a cynical posture, like it's definitely going to be that no matter what. That's not what I'm encouraging. But for us to have a sort of sociological skepticism. We've seen this before. We've seen things be promised as reform and then become new ways to contain people and control people. So it's on the basis of research that our skepticism is formed, and then we can engage it. But it's not as simple as an anti-tech posture, which is the way people like to hear it. They like to hear it as, oh, you're just against everything. And so that's, I think, important for everyone.

Hi. Ronika Lewis. So I run a technology and management consulting firm in Kendall Square. We're part of the Kendall Square Association. I'm also in the Harvard Extension School and a part of the MIT Solve community. So to kind of address some of the things that I've heard in here about how we can organize and push back: Solve is a community at MIT that focuses on the United Nations Sustainable Development Goals. And I literally just came back from the UN on Sunday.

What is the name? Solve?

Solve, yeah. So solve.mit.edu. And so one of the things that we definitely focus on is the future of work-- or actually, we call it work of the future. We know work is going to be around, so what does that look like? And because there is a lot of human and machine interaction with AI. And I travel a lot. In the airport, they were scanning my face. I just scanned my vitamin bottle before I came and ordered it from Amazon. Just one scan of the picture. There's a lot of concern around it.
And there are a lot of people who are mobilizing in ways-- globally as well-- to say that we can push back. At our Kendall Square Association 10th anniversary meeting earlier this year, Jeremy Heimans spoke-- and if you were there, tell him he was amazing. His new book is called New Power. And he talks about the collective power that we all have and the dichotomies between old power and new power and what that looks like. So when you're thinking it's just us against the big corporations, or us against all of the tech bros in Silicon Valley, how do we mobilize against them?

So my question, to which there is no answer, is whether you have thoughts about how we get past abstraction to the specificity of the productive racism of US-based tech companies. Because I think a lot of people-- certainly not in this room, but a lot of people who are raised in the category of white-- are going to hear your work and think, oh, that doesn't apply to me because I don't have race. And I just want to use tech for good. And so it's good what you're doing for black people, but we don't need that. So I just wonder how to penetrate that belief in abstraction, because I think it's so core to what you're doing. Thank you.

No, thank you. That is a really powerful way to frame the problem. And the thing I would put in the mix there is that in many ways, the opposite of abstraction is not only specificity but also a kind of everything is everything. It's a way of thinking not just in the abstract but in an encompassing vision. And that's part of this data drive to collect everything. And there are two chapters in particular that I think start to get at this question in Captivating Technology. One is by Tamara Nopper. She's looking at financial technologies and how you have FICO on the one hand, which is like the epitome of abstraction and reduction.
But then the alternatives that are coming down in terms of fintech counter that abstraction by saying, we are going to get data on everything and then calculate your risk and character. And so the alternative to abstraction itself is seeding all kinds of new forms of surveillance, where it's not just your economic activity that's being quantified, but all of your social connections. Whether someone you know has defaulted on a loan-- that's part of your own risk assessment. And so it's interesting to think about, again, how the alternative to abstraction creates this encompassing vision of how to calculate and how to manage risk that can be even more controlling and oppressive.

The other chapter, by Ron Eglash, is written in a more historical vein. And what's interesting-- and I didn't quite get it until I was editing his chapter and reading it-- is he's talking about how we associate racial science and eugenics only with this idea of robbing people of their individuality and reducing people to categories. Again, this reductive vision. And one of the things he argues and shows is how holism, and this idea of a holistic theory itself, has been a tool for racist science and oppression. This way of thinking about trying to explain everything. It's not being specific, but being general and trying to encompass everything-- how that was also part of the kernel of Nazi science, US racial eugenics, and so on. And so I would encourage those who are interested in this part of the conversation to check out especially those two chapters, because it troubles in some ways our easy categories, where the bad stuff we need to watch out for is only the reductive stuff-- only the FICOs, only the crude racial categories. I think we also have to be alert to and thinking critically about the stuff that doesn't look like that-- that's all-encompassing. A way of thinking that can be even more, I think, oppressive in the end.
I think we're kind of out of time, right? Got one more question? [INAUDIBLE] Yeah. OK, wonderful.

Hi, I'm Moman. And also a fellow here.

Hi, Moman.

And I was hesitating to raise my hand because this might be a complicated question. So I think of abstraction, formalism, and modeling as kinds of making things that are supposed to apply in a lot of different cases. And a lot of the problems I see come down to every abstraction inevitably failing somewhere, even if it was made for, quote unquote, good purposes. And I wonder how much we can design better abstractions, how much the problem is abstraction itself, and how we can exist as a civilization, or how we can rethink that, if we're not relying on abstraction at some level-- whether that's law, whether that's statistical measurements, whatever happens to be the abstraction.

No, I do think that that really dovetails with Professor Daniels' question. And I don't have a pat answer, like we can or we can't move without abstraction. But perhaps I'll let that question linger, and those who are looking for a good dissertation topic-- [LAUGHTER] --can write that down. It's a tough one. And I think when you dig deep into the examples and the questions that my work is posing, you do eventually arrive at this crossroads.

All right.

[APPLAUSE]