Democracy, AI and Privacy
January 25th, 2024, 2:00-4:00 pm. The lecture will be held ONLINE only.
https://classroom.aau.at/b/mat-6mu-tza
PD Dr. Carsten Ochs
This lecture is organized jointly by the Faculty of Social Sciences and the D!ARC.
Abstract:
Determining the relationship between democracy, AI and privacy raises the question of whether contemporary society in general is to be conceptualized in terms of an emerging digital condition: to what extent do digital transformations establish novel modes of sociotechnical structuration, thus bringing about a new type of society proper? Are digital networking, platforms, self-tracking, algorithmization, datafication, etc. to be considered surface effects of society as we know it, or is there structural modification of the foundational logic of the constitution of society? Empirical and theoretical answers to these questions vary greatly. Whereas some have suggested that what we call “digitization” in fact induces a new anthropo-logic, a new evolutionary phase in human becoming-with (M. Faßler), or at least novel modes of societal structuration (D. Baecker), others have regarded digitization simply as a continuation of genuinely modern structural principles (A. Nassehi).
Against this background, my presentation starts from the assumption that the only way to gain an analytical understanding of the constitution of digital society is to determine, in theoretical terms and on the basis of thorough empirical research, novel structural principles, with the notion of “structural principle” referring to modes of structuration that pervade societies at large. Is digitization, then, to be understood as a (Durkheimian) “total social fact”, as N. Marres has claimed? Answering in the affirmative presupposes that we identify genuinely digital structural contradictions. In my presentation, I will make two such contradictions a subject of discussion:
- First, there is the contradiction between what I call digital optionality (the increase of options for action brought about by digital infrastructures) on the one hand, and digital predictivity (the narrowing down of options via predictive analysis) on the other. As subjectification is faced with this contradiction, 21st-century privacy is bound to take the form of a right to unpredictability.
- Second, there is a contradiction between the societal disorder caused by Machine Learning-based AI systems’ structuration of communicative practices (see, e.g., the role of Facebook’s algorithms in the January 6, 2021 attack on the US Capitol) on the one hand, and what I call “hyper-nomy” on the other: the hypertrophic growth of non-negotiable normative ordering mechanisms that these same systems put into operation.
In the last part of my talk, I will speculate about the potential impact that these transformations of AI and privacy may have on democracy. As predictivity and hypernomy tend to narrow down the contingency of possible futures by sociotechnically fostering the reproduction of the past, a democratic politics of the digital is bound to come to terms with both of these structural principles in order to safeguard the openness of social futures.