Exe0.2 Eric Snodgrass

From Software Studies
Revision as of 15:50, 20 April 2016

//a direct continuation, in draft form, of earlier text for *.exe (ver0.1) gathering in preparation and for discussion at *.exe (ver0.2) meetup

sites of execution: interstice

Execution, in its computational sense, might be described as the carrying out of a set of operative instructions by a machine that, once set running, requires no intervention into the nature of its decidability. From such a perspective, execution is any ongoing event of programmed automation, however minute or extensive this period of execution time (or “runtime”) might be. Automation in this sense is an ongoing process of executability, in which instructions as well as inputs and outputs can be continuously decided and acted upon. Crucially, execution is not merely the computability of something but the actual practice and automated execution of this computability in the world. Each instance of execution brings the wound-up velocities of its logical abstractions into contact with the frictional and situated materials of its actualised encounters.
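This sense of execution as a pre-decided, automated event can be given a minimal sketch in code. The toy interpreter below is an illustrative invention (its two-instruction repertoire and the name `run` are made up for this example, not drawn from any actual machine): once launched, it carries its program through to completion without any intervention into its decidability.

```python
# A toy machine: once set running, every decision at the level of the
# executing code is already fixed by the program and its input.
def run(program, acc=0):
    """Execute a list of (op, arg) instructions; return the final accumulator."""
    pc = 0  # program counter: which instruction executes next
    while pc < len(program):  # "runtime": the ongoing event of execution
        op, arg = program[pc]
        if op == "add":
            acc += arg
        elif op == "jump_if_zero" and acc == 0:
            pc = arg  # control flow, too, is decided in advance
            continue
        pc += 1
    return acc

result = run([("add", 2), ("add", 3)])  # the outcome was settled at launch
```

However minute the runtime, nothing in this loop awaits a decision from outside: the "event" of execution is the unfolding of determinations already compiled into it.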

This points to the pincer-like quality of execution hinted at earlier: the way in which it brings computability and the materials for its executed computation into direct and active interaction with one another, creating in the process what Howse identifies as “sites of execution” via the kinds of “agential cuts” that Barad elucidates. In the case of automated execution, the agential cuts involved will include the particular contours of logic and computability as they have been compiled into the executing machine code and its related moving physical parts. And in respect to a site of execution, one can borrow from von Uexküll’s Umwelt and characterise sites as the material points of interaction (or even piercing collisions) of these agential cuts as they inscribe and articulate their perceptual-effectual qualities amidst other active ecologies of execution.

As Barad and Howse’s work is quick to point out, the very notion of isolating certain kinds of “sites” of execution is readily problematisable. In attempting to temporarily isolate sites of execution, as well as a definition of sorts for execution, one aim is to interrogate the notion of execution by moving between levels of abstraction and into its situated forms and enactments. Thus the examples of the tick or Howse’s work are highlighted because of the way in which they foreground particular sites of material execution, with the focus in these initial examples given to a sense of marks on bodies, and the fairly precisely defined points of piercing execution in relation to a body.

But execution in the world is a knotty, complex thing, and there are further productive ways in which one might consider the notion of a site of execution. Foucault’s employment of the concept of “interstice” is one such helpful resource. The term appears in Foucault’s essay and outline of what constitutes a genealogical approach, “Nietzsche, Genealogy, History” (originally published in 1971). For Foucault (at this juncture in his thinking), genealogy should treat the history of any emergent interpretations as a continued interplay and contestation of processes of "descent," "emergence" and "domination," as they alight and intersect from one act of interpretation and resultant manifestation to another. In speaking of emergence, Foucault describes how any study into the emergence of discursive processes should not be that of a speculation on origins, but rather a study of the “interstices” of existing (often confrontational) forces and how these interstices can be seen to provide traction or take on certain generative qualities. Interstitials of emergence are themselves formed from both existing energies of descent (the existing forces that have fed into sites of emergence) and domination (the ability of any existent or emergent discursive force to harden into stabilised, recognisable and thus enforceable forms of expression). Foucault (1984, p.84-5), drawing directly on Friedrich Nietzsche’s genealogical pursuits, describes a site of emergence as a “scene” where competing forces and energies “are displayed superimposed or face-to-face.” The interstice in this instance is a site of confrontation of such forces, it is “nothing but the space that divides them, the void through which they exchange their threatening gestures and speeches. . . 
it is a ‘non-place,’ a pure distance, which indicates that the adversaries do not belong to a common space.” In this formulation of Foucault’s then, an interstice is a scene, a superimposition and a confrontational void through which compelling but, in one sense or another, competing forms of potential gestures and energies generate certain expressive powers of exchange and action. The emphasis in this conceptual abstraction is not so much on the materiality of an actual site, but it nevertheless points to a sense of a key point of interoperating lines of force, for which the term interstice provides the overarching rubric. Foucault makes reference in this same passage to Nietzsche’s term Entstehungsherd, which it is worth noting is typically translated as “site of emergence”[3].

Of particular interest in contrasting this notion of interstice with that of the direct shifting and launching of computational processes into the skin of a user or the soil of the earth in the examples of Howse, is the way in which Foucault’s characterisation, while containing a similar notion of confrontation and interaction, nevertheless is characterised as a superimposition involving a “pure distance” (as will be returned to later, Howse himself, without any hint of Foucault, nevertheless ends up using the same term of "non-place" as a descriptor of the CPU[4]). In Foucault’s site of the interstice then, the emphasis is shifted from direct contact to this distance between superimposed forces and the way in which the resultant gaps in between these forces can create compelling drives and potentially productive, generative forms of interplay. To couch it in the terms of this essay, the focus in any such encounters in a site of execution then becomes just as much on the uncomputable as it is on the computable, with the interstice as the emergent site and breeding ground of a will to close this very gap.

As a simple example, consider the originary interstitial gap that Turing’s (1936) machine model opens up with its mandate of discrete, symbolic elements capable of being enumerated and made into effectively calculable algorithms for execution upon and by machines. In the further materialisation of Turing’s thesis into actual computing machines, the act of making things discrete, so as to be computable, becomes one of establishing machine-readable cuts: the switchable on and off state elements, or flip-flops executed via logic gates used to store and control data flow. Such flippable states constitute a material basis that allows for the writing and running of the executable binary instructions of machine code upon a computing machine. Although any manifestation of a Turing machine necessarily involves the storage and transcription of such symbols onto a continuous flux of what (in contrasting relation) are described as “analogue” materials, this marked independence of symbol from substrate (an independence strong enough for Turing to anoint it as “universal”), their outright “indifference” (Schrimshaw 2012; Whitelaw 2013) and yet simultaneous reliance upon each other for functioning purposes, is a kind of “pure”, interstitial distance that Foucault’s formulation can be seen to point at. A key emergent “medial appetite” instigated here is the ongoing way in which various extensive and intensive qualities of the digital and analogue (the discrete and the continuous) are always in some respect “out of kilter” (Fuller 2005, p.82) with one another[5]. Out of such interstitials... the emergence of things such as the seemingly pathological drive to have the digital replicate the analogue and overcome the “lossy” gaps in such encounters; the constant pushing towards faster speeds, higher resolutions, “cleaner” signals, “smarter” machines and so on.
Such interstitial, protocological sparks are often readily evident in even just the collision course nature of the names we give to these endeavours: “Internet” “of” “Things”; “Artificial” “Intelligence”. A particularly productive agential cut then, this incision of the digital and its seismic materialisation in computational form, giving birth as it has to the “manic cutter known as the computer” (Kittler 2010, p.228).
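The “switchable on and off state elements” mentioned above can themselves be sketched. A set-reset latch built from two cross-coupled NOR gates is the canonical storage element of this kind; the toy simulation below (a hedged illustration in Python, not a hardware description) simply iterates the two gates until they settle into a stable state:

```python
def nor(a, b):
    """A NOR logic gate: 1 only when both inputs are 0."""
    return int(not (a or b))

def sr_latch(s, r, q=0, qbar=1):
    """Settle a NOR-based set-reset latch; returns the stable (q, qbar)."""
    for _ in range(4):  # a few passes are enough for the feedback to settle
        q, qbar = nor(r, qbar), nor(s, q)
    return q, qbar

# set the bit, then hold it: the latch "remembers" with both inputs at 0
q, qbar = sr_latch(1, 0)           # set  -> (1, 0)
q, qbar = sr_latch(0, 0, q, qbar)  # hold -> still (1, 0)
```

One stored bit, held in place by nothing but the mutual feedback of two gates: the material cut upon which the executable binary instructions of machine code are written.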

One might even characterise such a method of divide and conquer as essential to most discursive systems, the method in question working to make entities executable according to their particular logics and delimited needs. In relation to execution and its capacities of automation, the productive interstitial gap involved instigates a potential drive towards making more things executable according to the logic of the executing process in question, of enclosing more entities and procedures into its discursive powers. This is a will towards what Foucault names "domination": the way in which sites of emergence can be understood to have the ability to sediment into dominant forms of understanding and action. Domination is the consolidating of an emergent will, a species-like hardening of a system of interpretation and a bending of other wills and forces to its own compositional rules of interpretation. An engraving of power into the bodies it would make accountable to it. Such a self-amplifying will to power often gains further traction by convincing all parties involved of its own normative necessity and inevitability, pushing to the side any notions of the contingent accumulation of its situation and the reconfigurability of its aggregate parts. In its peak stage, domination is when the executability of an emergent will—computational, cultural, political, etc.—is able to attain a level of everyday, almost automatic execution. Thus the norm critique of Foucault points towards an understanding of execution and executability as a productive ability by a process to make dominant and sustainable its particular discursive practices and requirements in a world of many other potential processes (a hypothetical flexibility, it should be noted, not necessarily matched at the material level).

Here one can turn to the discussions and examples that Wendy Chun (2016) suggests might be seen as paradigmatic of contemporary computational habituations, such as her recent characterisations of “habitual new media” and the way in which such media have a notable quality of driving users into a cycle of “updating to remain the same.” Consider, for instance, the current habit of a continually shifting and often powerfully political set of terms of service that are rolled out by many of the currently dominant networked computing platforms to their accepting users. As Chun and others (e.g. Cayley 2013) point out, every such instance of updating and agreeing to terms involves a certain abnegation of decision and responsibility. Nevertheless, the computational codes that underwrite more and more of contemporary cultural and political activities of the moment have a particular way of selling themselves as the best means with which to guarantee their subjects a sustainable living in the digital era (whether what is being sold is a data mining process for being kept safe from terrorism or a particular social feature to be implemented in the name of a better collective experience). Again, a pincer-like discursive binding can be seen to be in play in these impositions, whereby the tools in question are “increasingly privileged. . . because they seem to enforce automatically what they prescribe” (Chun 2011, p.91-2). Chun goes on to point out how such moves can be seen as a promotion of code as logos, “code as source, code as conflated with, and substituting for, action… [producing] (the illusion of) mythical and mystical sovereign subjects who weld together norm with reality, word with action.” (Chun 2011, p.92).
As can be seen to be enacted in many of the ongoing crisis-oriented forms of governance today, in which a state of emergency of one kind or another is implied as the natural state of things, the application of computational modes of execution into such situations can be viewed as almost a perfection of the potential violence of the polis, a cutting out of the executive so as to become the executor (Chun 2011, p.101). Or as Geoff Cox (2012, p.100.1) describes it, “code exceeds natural language through its protocological address to humans and machines. It says something and does something at the same time—it symbolizes and enacts the violence on the thing: moreover, it executes it.” The shared and conditional chorus in this ensemble of human and nonhuman agents summed up in a Gilbert & Sullivan lyric: “Defer, Defer, to the Lord High Executioner!”

Such a pointing towards the ways in which automation involves the acceptance, and thus removal, of a moment of decision also points to how the initialising of the decision engine of an executable computer program is both the initiating of a decision-making process and also a termination, in that the moment a file is run, decision-making processes at the level of the executing code in question are thus set. Executing decisions become predefined, as do the parameters of the computable inputs and outputs. This is not to bemoan such a situation, but rather to consider this potent quality of computation as it is put into sites of encounter with the discursive entities that are brought within its range. Programmed habituation has demonstrated a range of both exciting and predictable vectors thus far (with Chun's examples highlighting a wide spectrum of its manifestations). Unsurprisingly, one such vector is the will to mobilise ever further forms of programmed executability and automation. One of programmed execution’s most powerful effects is not only its deferral of responsibility, but the expansion of power that can occur once such a deferral is normalised. Consider the qualities of deferral and sliding expansion that layers of code and their corresponding modes of abstraction promote in a simple progression like the following:

Execute this individual with this weapon.
Execute this individual according to this set of laws.
Execute this group of individuals in accordance with this set of laws.
Execute this program that executes that group of individuals in accordance with this set of laws.

sites of execution: invisible hands

shift.

In histories of computing, the punched cards of Joseph Maria Jacquard’s looms and Charles Babbage and Ada Lovelace’s implementation of a punched card model of programming for Babbage’s Analytical Engine make for popular launching points for discussions on the epistemic and material breakthroughs that led to the general purpose computers of the present. As writers such as Sadie Plant (1995) and Nicholas Mirzoeff (2016) have highlighted, in the midst of his transition from his work on the Difference Engine to that of the Analytical Engine, Babbage undertook an extended study of the workings and effects of automated machines used in various forms of manufacture. The research resulted in the text ''On the Economy of Machinery and Manufactures'', published in 1832 to quite popular acclaim at the time. The text features extended explanations of the mechanical operations of such machines in the factories of Babbage's time and their accompanying forms of specialisation, standardisation and systemisation, as reflected both within the factories and also in the economies emerging out of these mechanised factories as a whole. The influence of Adam Smith and nineteenth-century liberalism is particularly evident, with Babbage emphasising the benefits that might be achieved with greater division of labour: “the most important principle on which the economy of a manufacture depends is the division of labour” (Babbage, cited in Mirzoeff 2016, p.4). As witnessed in one passage describing Gaspard de Prony’s work with turning unemployed servants and wig dressers into “computers” capable of calculating trigonometric tables by means of addition and subtraction, the idea can be seen to take hold on Babbage that such factories and strategically partitioned forms of labour might be understood as schematics and material setups for entirely mechanical forms of computation.
Babbage would later look back on these early factories as prototypes for his “thinking machines,” with his naming of the “mill” as the central processing unit for his Analytical Engine figuring as the most obvious token of this influence on his combined computational and material thinking. And all the while, a concurrent partitioning and making invisible of the workforces involved in these developments persists into the present day, whether one is speaking of the above-mentioned servants and wig dressers, Ada Lovelace's work with Babbage, or the many other women, minorities and other all-too-often precariously employed workers involved therein. These bodies and their “nimble fingers” (Nakamura 2014) continue to remain in almost entirely marginalised positions, at best as seeming “footnotes” (Plant 1995, p.63-4) to histories of computing, despite their crucial role in the creation, running and sustaining of these computational economies, whether at the level of manufacturing (Nakamura 2014), programming (Chun 2015; Plant 1995) or online community management and support (Nakamura 2015).
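The table-making labour that de Prony partitioned, and that Babbage's engines were designed to mechanise, rests on the method of finite differences: any polynomial can be tabulated using addition alone, each difference repeatedly absorbing the one beneath it. A minimal sketch (the function name and the squares example are this text's own illustration):

```python
def tabulate(diffs, n):
    """Produce n values of a polynomial by addition alone, given its
    initial value and finite differences (the difference-engine method)."""
    diffs = list(diffs)
    values = []
    for _ in range(n):
        values.append(diffs[0])
        # update each difference by adding the one below it
        for i in range(len(diffs) - 1):
            diffs[i] += diffs[i + 1]
    return values

# f(x) = x^2 has initial value 0, first difference 1, constant second difference 2
squares = tabulate([0, 1, 2], 5)  # [0, 1, 4, 9, 16]
```

No multiplication, no trained judgement required at any step: exactly the reduction that allowed unemployed servants and wig dressers to be composed into a reliable computing machine.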

[[File:Scissors.png|240px|thumb|Antoine-Augustin Cournot's 1838 scissor diagram, part of a series of what are thought to be the first recorded uses in economics of geometrically plotted supply and demand curve analysis, image from Thomas Humphrey, "Marshallian Cross Diagrams and Their Uses before Alfred Marshall: The Origins of Supply and Demand Geometry", 1992, p.5]]

While discussions of invisibility and power typically highlight invisibility's ability to cloak the "dirty work" of any particular will towards domination, it is worth remembering that acts of blackboxing and making invisible are often discursively and practically implemented in the name of efficiency. Smith and the rising stars of liberal economic thinkers in Babbage's time were notable for their powerful new formulations of economic rationality and self-interest, devising what quickly proved powerfully attractive models for both interpreting and acting upon human economic behaviour. Chief among these new models was that of geometrically plotted supply and demand curves, relatively simple mechanisms for determining optimal forms of production according to the ideal price points—key sites for execution in a capitalist economy—that these models revealed. Armed with such powerful algorithmic-style methods for rational determinations of maximal economic “efficiency,” liberal economics, as discipline, instrument and ideology, takes off, providing as it does a relatively uncluttered road map for the emerging capitalist forms of production at the time. As the example of Babbage indicates, and as examples such as Mirzoeff’s (“Control, Compute, Execute: Debt and New Media, The First Two Centuries”) recent reading of debt’s key role in new media’s origins highlight, economics acts as a key and notably concurrent and ongoing point of reference for computational thinking and practices. Consider, for instance, John von Neumann’s contributions to both fields. Or the economic substrate of venture capital's particular logic of funding that sprouts and orients so many of the silicon valleys(/alleys) in today's digital landscape (while, again, funding none of the crucial interventions and work of the many "venture labourers"—Gina Neff, cited in Nakamura 2015—who help to support, curate and maintain these very landscapes).
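The algorithmic simplicity of these models is worth registering. In their linear textbook form, the "ideal price point" is nothing more than the crossing of two straight lines, a computation trivial enough to hint at why such models proved so attractive as instruments. A sketch with invented coefficients, purely for illustration:

```python
def equilibrium(a, b, c, d):
    """Intersection of linear demand Q = a - b*P and supply Q = c + d*P."""
    p = (a - c) / (b + d)  # the price at which the two curves cross
    q = a - b * p          # quantity demanded (and supplied) at that price
    return p, q

# e.g. demand Q = 100 - 2P against supply Q = 10 + P
price, quantity = equilibrium(100, 2, 10, 1)  # (30.0, 40.0)
```

A rational determination of "efficiency" reduced to one division: the uncluttered road map, in executable form.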

In his book ''An Inquiry into the Nature and Causes of the Wealth of Nations'', Smith (1843, p.184) famously evokes at one (and only one) point an image of the invisible hand: “by directing that industry in such a manner as its produce may be of the greatest value, he intends only his own gain, and he is in this, as in many other cases, led by an invisible hand to promote an end which was no part of his intention.” As Foucault (2008) makes clear in his analysis of the origins of various forms of liberal economic thought (particularly those of “ordoliberal” and “neoliberal” forms), this is a blackboxing of economic responsibility and decision making, pronouncing economic causes as both autonomous and unknowable, and thus unregulatable and ungovernable. Furthermore, such an invisible instruction pointer is a handy processing device for paving the way for a kind of automation of the economy in the service of this liberated invisible hand. As in Chun’s examples, it is the blackboxing, the making invisible of the source in the name of a kind of “sorcery,” that is key here. As Foucault (2008, p.279-80) highlights, the invisibility of this hand is its key feature, paving the way as it does for a conveniently colour-blind, driverless system in which one can happily push to the side thorny ethical issues such as systemic bias, human discrimination and the many age-old and ongoing feedback loops of the seemingly tightly interwoven cause-effects that arise from the interstitials of such issues.

One compelling example of shifting the site of execution so as to specifically interrogate execution as it transpires across economics and its current computational manifestation in the form of databases can be found in the artist duo YoHa's (Graham Harwood and Matsuko Yokokoji) work ''Invisible Airs'', featuring a series of “contraptions”[6] aimed at highlighting and performing the largely opaque contours of Bristol City Council’s expenditures, as laid out in the Council's newly minted open data initiatives of the time. Acknowledging the ongoing historical and social formations of gaps between knowledge and power—in this case the “gap between the wider public's perception of data” (as well as a contemporary “form of indifference toward the expectations of this kind of open data initiative”)—the pieces in this project aim to create “A partial remedy for this indifference” through “making data more vital” while also “taking a more critical view of transparency itself” (Harwood 2015, p.93).

As with almost any of YoHa’s work, the project resulted in a range of different outputs and forms of engagement. For a helpful overview, see Alistair Oldham's excellent short documentary on the project: https://vimeo.com/36567631. One core interrogation of the dataset was that of expenditures of over £500...... ((almost finished here... another 1500 or so words, mostly on Invisible Airs, to be inserted here in advance of discussion at Malmö event))
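For readers curious about the mechanics of such an interrogation, the £500 threshold query can be gestured at in a few lines of Python. The column names and figures below are hypothetical stand-ins for illustration, not the Council's actual schema:

```python
import csv
import io

# An invented fragment of council open data, for illustration only.
data = """supplier,amount
Acme Paving Ltd,1200.00
Stationery Direct,85.50
CloudHost plc,640.00
"""

def over_threshold(csv_text, threshold=500.0):
    """Return (supplier, amount) for rows whose amount exceeds the threshold."""
    rows = csv.DictReader(io.StringIO(csv_text))
    return [(row["supplier"], float(row["amount"]))
            for row in rows if float(row["amount"]) > threshold]

large_spends = over_threshold(data)  # the rows over the £500 threshold
```

The query itself is banal; YoHa's contraptions work precisely by refusing to leave such filtering at the level of a screenful of rows.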

YoHa, Invisible Airs, "pneumatic contraption" for "open data book stabbing"

sites of execution: interface

[[File:Like.png|240px|thumb]]


notes

[3] Entstehungsherd has also been translated as “breeding ground”, hinting both towards the medical sense of Herd as the “seat” or “focus” of a disease and also to its further designation as a site of any biological activity that makes something emerge (Emden 2014, p.138). This is part of a general transition in Nietzsche’s thinking, one that shows up most clearly in his Genealogy of Morals, whereby he aims to supplant notions around ''Ursprung'' (origin) with that of Entstehungsherd. Foucault’s description of the interstice can itself be further compared with his well known formulation of the apparatus (dispositif) and its “grids of intelligibility,” in which systemic connections between heterogeneous ensembles of elements are made productive. See Matteo Pasquinelli’s (2015) helpful commentary, “What an Apparatus is Not: On the Archeology of the Norm in Foucault, Canguilhem, and Goldstein,” for a genealogy of Foucault’s employment of dispositif, one that like von Uexküll’s Umwelt sees a tracing back of continental philosophy to an earlier breeding ground of biophilosophical ruminations.

[4] Howse (2015): "But what is exactly this hidden place of the now, where symbolic orders, where language becomes material change at a quantum level? Where words are subjected to literal and not literary un-angelled noise. . . This non-place is the CPU or Central Processing Unit, anchoring any technology, AKA. the Dark Interpreter".

[5] A little note on the quantum bit in quantum computing. Is this a shift from dialectics to "trialectics" (Asger Jorn) or to the "sheer hardware" of Kittler...

[6] A term used by YoHa to signal "a domain where the technical overlaps with the imaginary. . . a playful assemblage that privileges speculation over utility." A contraption in YoHa's adoption of the term is also any device notable for its highlighting of "the unstable state before invention becomes normalised," one whose "inherently unstable refusal of utility [emphasises] the forces in the machine that could break it" (Harwood 2015, p. 73). Matsuko Yokokoji (in Conversation between Matsuko Yokokoji, Graham Harwood, Jean Demars during the construction of Coal Fired Computers, 2009): "Contraption is not kinetic art, not Jean Tinguely. It's closer to Moholy-Nagy's light-space-modulator but only after it has gone through the filter of Nagy's flight from the Nazis." Finally, through their very non-discursive, irrational qualities, contraptions also have a way of materially shifting the needle of interpretation, highlighting the violence of discourse itself via their own "unsafe" methods. Thus, "A peddle-driven aeroplane is a contraption while a nuclear submarine is seen as some how rational," the contraption casting an oblique light on the discursive machines it plays off of, creating "an unsafe space in which the non-discursive can mix freely with unhinged imaginings. It is a place where limitations of knowledge and discipline are curiously redundant and become a basic method of enquiry" (Harwood 2015, pp. 73-74).

references

Cayley, John. “Terms of Reference & Vectoralist Transgressions: Situating Certain Literary Transactions over Networked Services.” Amodern 2: Network Archaeology. 2013. Web. http://amodern.net/article/terms-of-reference-vectoralist-transgressions/

Chun, Wendy Hui Kyong. "On Software, or the Persistence of Visual Knowledge." Grey Room, Winter 2005, No. 18: 26–51. Print.

Chun, Wendy Hui Kyong. “Crisis, Crisis, Crisis, or Sovereignty and Networks”. Theory, Culture & Society, Vol. 28(6):91-112, Singapore: Sage, 2011. Print.

Chun, Wendy Hui Kyong. Updating to Remain the Same: Habitual New Media. Cambridge, Massachusetts: The MIT Press, 2016. Print.

Cox, Geoff. Speaking Code: Coding as Aesthetic and Political Expression. Cambridge, Massachusetts: The MIT Press, 2012. Print.

Emden, Christian J. Nietzsche's Naturalism: Philosophy and the Life Sciences in the Nineteenth Century. Cambridge University Press, 2014. Print.

Foucault, Michel. The Foucault Reader. Ed. Paul Rabinow. New York: Pantheon, 1984. Print.

Foucault, Michel. The Birth of Biopolitics: Lectures at the Collège de France 1978–1979. Trans. Graham Burchell. New York: Palgrave Macmillan, 2008.

Fuller, Matthew. Media Ecologies: Materialist Energies in Art and Technoculture. Cambridge, Massachusetts: The MIT Press, 2005. Print.

Harwood, Graham. Database Machinery as Cultural Object, Art as Enquiry. Doctoral dissertation. Sunderland: University of Sunderland, 2015. Print.

Howse, Martin. “Dark Interpreter – Provide by Arts for the hardnesse of Nature.” Occulto Magazine, Issue δ, 2015.

Kittler, Friedrich. Optical Media: Berlin Lectures 1999. Tr. Anthony Enns. Cambridge, England: Polity Press, 2010.

Mirzoeff, Nicholas. “Control, Compute, Execute: Debt and New Media, The First Two Centuries.” After Occupy. Web. http://www.nicholasmirzoeff.com/2014/wp-content/uploads/2015/11/Control-Compute-Execute_Mirzoeff.pdf

Nakamura, Lisa. “Indigenous Circuits: Navajo Women and the Racialization of Early Electronics Manufacture.” American Quarterly, 66:4, December 2014, 919-941. https://lnakamur.files.wordpress.com/2011/01/indigenous-circuits-nakamura-aq.pdf

Nakamura, Lisa. “The Unwanted Labour of Social Media: Women of Color Call Out Culture as Venture Community Management.” New Formations: a journal of culture, theory, politics, 106-112, 2015. https://lnakamur.files.wordpress.com/2011/01/unwanted-labor-of-social-media-nakamura1.pdf

Pasquinelli, Matteo. "What an Apparatus is Not: On the Archeology of the Norm in Foucault, Canguilhem, and Goldstein." Parrhesia, n. 22, May 2015, pp. 79-89. Web. http://www.parrhesiajournal.org/parrhesia22/parrhesia22_pasquinelli.pdf

Plant, Sadie. “The Future Looms: Weaving Women and Cybernetics.” In Mike Featherstone and Roger Burrows (eds.), Cyberspace/Cyberbodies/Cyberpunk: Cultures of Technological Embodiment. London: Sage Publications, 1995. Print.

Schrimshaw, William Christopher. “Undermining Media.” artnodes, No.12: Materiality, 2012. Web. http://artnodes.uoc.edu/index.php/artnodes/article/view/n12-schrimshaw

Smith, Adam. An Inquiry into the Nature and Causes of the Wealth of Nations. Edinburgh: Thomas Nelson, 1843 [originally published 1776].

Turing, Alan M. “On Computable Numbers, with an Application to the Entscheidungsproblem.” Proceedings of the London Mathematical Society, Second Series, Vol. 42, 1936, pp. 230–265. Print.

Whitelaw, Mitchell. “Sheer Hardware: Material Computing in the Work of Martin Howse and Ralf Baecker.” Scan: Journal of Media Arts Culture, Vol.10 No.2, 2013. Web. http://scan.net.au/scn/journal/vol10number2/Mitchell-Whitelaw.html