Computing the Law Searching for Justice


Computing the Law // Searching for Justice


This talk explores two distinct mechanisms that have responded to the injustices of conflict and war in ways that are suggestively computational. The forums of the International Criminal Tribunal, with their elaborated “Rules of Procedure and Evidence”, have, I argue, transformed the juridical apparatus into a quasi-machinic set of operations that compress the affective realm of experience through the legal strictures of testimony and cross-examination. Stressing an apparent continuity between the law and its computability is not simply to attribute a formal or mathematical analogy to its functions but to recognise a “philosophical latency” and “historical inheritance” [1]. Indeed, the origins of certain legal processes inform the language of computation, such that the term “execute,” as in “to execute a coding script,” derives from the fourteenth-century legal sense of carrying out or accomplishing a course of action: to prosecute, to issue a warrant, or to sentence.

Moreover, the protocols that govern what counts as evidence and who counts as a witness are scrupulously attended to, such that the subjective dimensions of testimony and the expressive qualities of material evidence are systematically flattened and disarticulated of all affect as witnesses and exhibits move through the circuitry of the court. By contrast, the model of the Truth and Reconciliation Commission—which has no legal mandate but is in general organised by a quest for justice and an ethical demand that the perpetrator of violence account for and admit to wrongdoing—might be characterised as a kind of “incomputable object,” one that operates in what Eyal Weizman names an “excess of calculation.” The distressing and oftentimes diverse forms of exchange between victims and perpetrators, which include re-enactments and the performance of cultural rituals, produce a surfeit of information that is not conditioned by legal codes that would render such expressive forms of testimony subservient or inadmissible. Unlike the rule-based logic that organises the Criminal Tribunal to arrive at a verdict—guilty or not guilty, upon an evaluation of all tendered evidence and testimony—the affective processes of the Truth Commission can’t be fully captured by an instrumental or computational conception of justice as merely on or off, right or wrong. Rather, justice is performed through a complex process of speaking to history and listening to stories that had previously fallen upon deaf ears, or were wilfully ignored or violently suppressed. Testimony is given, not in order to mete out punishment or acquire compensation, but as a public claim for recognition of that which took place, whether events occurred during the more recent decades of political repression or over extended periods of colonial rule. Testimony is given so that the survivors of historical injustice can begin to “live” in the present rather than merely endure. Testimony is given so that the conditions for a shared humanity might begin to be assembled.


Luciana Parisi argues that the behaviour and ontology of algorithmic objects, which arise with interactive and distributive computing, need to be understood as generative and not simply defined by a finite set of rules [2]. Only when computational thinking is grasped in this manner does it offer a conceptual framework whereby the formal rules and practices that script legal codes could give way to the emergence of new ethical arrangements. For Gilles Deleuze, reflecting upon the French legal tradition, jurisprudence was not the study of the underlying principles of law—its legal source code—but concerned the creation of new laws as well as the creation of rights expressed by and through these laws. Jurisprudence, he argued, is “everything that creates problems for the law, that threatens to call what is established by law into question” [3]. Jurisprudence was thus conceived as a domain of thought and practice that was productive of rights and thus also of new forms of politics. With this understanding in mind, could the search for justice that impels the activities of a Truth and Reconciliation Commission also be conceptually aligned with a much more radicalised understanding of the algorithmic realm as generative of new configurations of data capable of evolving new modes of behaviour? Or does the realm of ethics remain forever incomputable? Can the search for truth as an experimental form emerge out of an algorithmic-like set of coding practices, as Parisi might contend? As I tried to address in my essay “Deadly Algorithms,” which explored the legal implications of remote-controlled drone warfare and decisions to execute that are arrived at through data-aggregation, algorithms are not simply re-ordering the fundamental principles that govern our lives, but are also being tasked with providing alternate ethical arrangements derived out of new modes of reasoning that are increasingly computational. These are some of the fundamental questions I would like to begin to raise in this talk.

--Susan Schuppli, March 2016

[1] As Laura U. Marks likewise suggests in her work discussing the affinities between systems of data compression in new media art and the kinds of Islamic world-making practices characteristic of Persian carpets. Laura U. Marks, Enfoldment and Infinity: An Islamic Genealogy of New Media Art (London: MIT Press, 2010), p. 5.

[2] Luciana Parisi, Contagious Architecture: Computation, Aesthetics, and Space (Cambridge, MA: MIT Press, 2013).

[3] Paul Patton, "Immanence, Transcendence, and the Creation of Rights," lecture delivered at Birkbeck, 7 March 2011.

Biography

Susan Schuppli is an artist and researcher based in London. Her research practice examines media artefacts that emerge out of sites of contemporary conflict and state violence to ask questions about the ways in which media are enabling or limiting the possibility of transformative politics. Current work explores the ways in which toxic ecologies, from nuclear accidents and oil spills to the dark snow of the Arctic, are producing an “extreme image” archive of material wrongs.

Creative projects have been exhibited throughout Canada, the US, Europe and Asia. Recent and forthcoming exhibitions include Casino Luxembourg, Extra City Antwerp, Stroom Den Haag, Shanghai Biennale, Charlottenborg, Galerie Wedding, Witte de With, Fundacion Proa and Bildmuseet Sweden. She has published widely within the context of media and politics and is the author of the forthcoming book Material Witness (MIT Press, 2015), which is also the subject of an experimental documentary.

She is Senior Lecturer and Deputy Director of the Centre for Research Architecture, Goldsmiths. From 2011 to 2014 she was Senior Research Fellow on the ERC project Forensic Architecture led by Eyal Weizman (Principal Investigator). Previously she was an Associate Professor in visual/media arts in Canada. Schuppli received her PhD from Goldsmiths and participated in the Whitney Independent Study Program after completing her MFA at the University of California San Diego. For more details, see http://susanschuppli.com/

Suggested readings

Schuppli, Susan. "Deadly Algorithms: Can legal codes hold software accountable for code that kills?" Radical Philosophy 187.Sept/Oct (2014): 2-8. https://www.radicalphilosophy.com/commentary/deadly-algorithms

Schuppli, Susan. "Should Videos of Trees have Standing? An Inquiry into the Legal Rites of Unnatural Objects at the ICTY." A Cultural History of Law in the Modern Age. Eds. Celermajer, Danielle and Richard Sherwin. London: Bloomsbury, (2016): 1-35.

Teubner, Gunther. "Rights of Non-Humans? Electronic Agents and Animals as New Actors in Politics and Law." Journal of Law & Society 33.4 (2006): 497–521.

Supplementary readings

Dewey, John. "The Historic Background of Corporate Legal Personality." Yale Law Journal 35.6 (1926): 655-73.

Johnson, Deborah G., and Keith W. Miller. "Un-Making Artificial Moral Agents." Ethics and Information Technology 10.2-3 (2008): 123-33.

Wein, Leon E. "The Responsibility of Intelligent Artifacts: Toward an Automation Jurisprudence." Harvard Journal of Law & Technology 6 (1992): 51.

Lin, Patrick, George Bekey, and Keith Abney. "Robots in War: Issues of Risk and Ethics." Ethics and Robotics. Eds. Capurro, R. and M. Nagenborg. Heidelberg: AKA Verlag, 2009.

Bechtel, William. "Attributing Responsibility to Computer Systems." Metaphilosophy 16.4 (1985): 296-306.

