Exe0.2 Eric Snodgrass
//a direct continuation, in draft form, of ''[[Exe0.1_Eric_Snodgrass|earlier text for *.exe (ver0.1)]]'' for discussion at ''[[*.exe (ver0.2)|*.exe (ver0.2)]]'' meetup. also: ''[http://softwarestudies.projects.cavi.au.dk/index.php/File:7-exhibition9.JPG S.T.A.B.B.Y.]''
'''sites of execution: interstice'''
:: “the empty container implies the existence of the object that it is supposed to possess”
:: —Graham Harwood ("Pneumatic Soiree" presentation, part of YoHa's ''Invisible Airs'', 2011)
Execution in the world is a knotty, complex thing, and there are further ways that the notion of a site of execution might be productively considered. Foucault’s employment of the concept of “interstice” is one such helpful resource. The term appears in Foucault’s outline of what constitutes a genealogical approach, “Nietzsche, Genealogy, History.” For Foucault (at this juncture in his thinking), genealogy should treat the history of any emergent interpretations as a continued interplay and contestation of processes of "descent," "emergence" and "domination," as they alight and intersect from one act of interpretation and resultant manifestation to another. In speaking of emergence, Foucault describes how any study into the emergence of discursive forces should not be that of a speculation on origins, but rather a study of the “interstices” of existing, often confrontational, forces and how these interstices can be seen to provide traction or take on certain generative qualities. Interstitials of emergence are themselves formed from both existing energies of "descent" (any enduring forces that can be seen to feed into sites of emergence) and "domination" (the ability of any existing or emergent discursive force to harden into stabilised, recognisable and thus enforceable forms of expression). Foucault (1984, p.84-5), drawing directly on Friedrich Nietzsche’s genealogical pursuits[3], describes a site of emergence as a “scene” where competing forces and energies “are displayed superimposed or face-to-face.” The interstice in this instance is a site of confrontation of such forces; it is “nothing but the space that divides them, the void through which they exchange their threatening gestures and speeches. . . it is a ‘non-place,’ a pure distance, which indicates that the adversaries do not belong to a common space.” In this formulation then, an interstice is a scene, a superimposition and a confrontational void through which compelling but, in one sense or another, competing forms of potential gestures and energies generate certain expressive powers of exchange and action. The emphasis in this conceptual abstraction is not so much on the materiality of an actual site, but it nevertheless points to a sense of a key point of interoperating lines of force, for which the term interstice provides the overarching rubric.
Of particular interest in contrasting this notion of interstice with that of the direct shifting and launching of computational processes into the skin of a user or the soil of the earth in the examples of Howse, is the way in which Foucault’s characterisation, while containing a similar notion of confrontation and interaction, nevertheless is characterised as a superimposition involving a “pure distance” (as will be returned to later, Howse himself, without any hint of Foucault, nevertheless ends up using the same term of "non-place" as a descriptor of the CPU[4]). In Foucault’s site of the interstice then, the emphasis is shifted from direct contact to this distance between superimposed forces and the way in which the resultant demarcated gaps in between these forces can create compelling drives and potentially productive, generative forms of interplay. To couch it in the terms of this essay, the focus in any such encounters in a site of execution then becomes just as much on the uncomputable as it is on the computable, with the interstice as the emergent site and breeding ground of a will to interrogate or close this very gap.
As a simple example, consider the originary interstitial gap that Turing’s (1937) machine model opens up, with its mandate of discrete, symbolic elements capable of being enumerated and made into effectively calculable algorithms for execution upon and by machines. In the further materialization of Turing’s thesis into actual computing machines, the act of making things discrete, so as to be computable, becomes one of establishing machine-readable cuts: the switchable on and off state elements, or flip-flops executed via logic gates used to store and control data flow. Such flippable states constitute a material basis that allows for the writing and running of the executable binary instructions of machine code upon a computing machine, with the standard process for preparing and executing so-called "source" code involving a translation into machine executable code via a compiler, interpreter or assembler (or some amalgamation of properties of each of these approaches). Although any manifestation of a Turing machine necessarily involves the storage and transcription of such discrete symbols onto a continuous flux of what (in contrasting relation) are described as “analogue” materials, this marked independence of symbol from substrate (an independence strong enough for Turing to anoint it as “universal”), their outright “indifference” (Schrimshaw 2012; Whitelaw 2013) and yet simultaneous reliance upon each other for functioning purposes, is a kind of “pure”, interstitial distance that Foucault’s formulation can be seen to point at. A key emergent “medial appetite” instigated here is the ongoing way in which various extensive and intensive qualities of the digital and analogue (the discrete and the continuous) are always in some respect “out of kilter” (Fuller 2005, p.82) with one another. Out of such interstitials emerge things such as the seemingly pathological drive to have the digital replicate the analogue and overcome the “lossy” gaps therein, the constant pushing towards faster speeds, higher resolutions, “cleaner” signals, “smarter” machines and so on. Such interstitial, protocological sparks are often readily evident in even just the collision course nature of the names we give to these endeavours: “Internet” of “Things”; “Artificial” “Intelligence”; "Virtual" "Reality". A particularly productive agential cut then, this incision of the digital and its seismic materialisation in computational form, giving birth as it has to the “manic cutter known as the computer" (Kittler 2010, p.228).
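By way of a minimal illustration of this discrete, rule-bound character of the model (a sketch added here for discussion, not drawn from Turing or from any of the works cited), the whole apparatus of tape, symbols and transition table can be mocked up in a few lines of Python. The toy machine below does nothing more than flip binary digits until it reaches a blank cell, but every step is exactly the kind of enumerable cut described above: read a symbol, consult a finite table, write, move, repeat.
<syntaxhighlight lang="python">
# Toy sketch of Turing's model: a tape of discrete symbols, a finite transition
# table and a head that reads, writes and moves one cell at a time. The machine
# simply inverts a string of binary digits and halts at the first blank cell.
def run_machine(tape, rules, state="start", head=0, max_steps=1000):
    tape = list(tape)
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = tape[head] if head < len(tape) else "_"   # "_" marks a blank cell
        write, move, state = rules[(state, symbol)]        # one discrete, rule-bound step
        if head < len(tape):
            tape[head] = write
        else:
            tape.append(write)
        head += 1 if move == "R" else -1
    return "".join(tape)

# (current state, symbol read) -> (symbol to write, head movement, next state)
rules = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}

print(run_machine("010011", rules))  # prints "101100_"
</syntaxhighlight>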
Such a method of divide and conquer can be understood as essential to most discursive systems, the method in question working to make entities executable according to their particular logics and delimited needs. In relation to execution and its capacities of automation, the productive interstitial gap involved instigates a potential drive towards making more things executable according to the logic of the executing process in question, of enclosing more entities and procedures into its discursive powers. This is a will towards what Foucault gives the name of domination to, the way in which sites of emergence can be understood to have the ability to sediment into dominant forms of understanding and expression. Domination is the consolidating of an emergent will, a species-like hardening of a system of interpretation and a bending of other wills and forces to its own compositional orientations. An engraving of power into the bodies it would make accountable to it. Such a self-amplifying will to power often gains further traction by convincing all parties involved of its own normative necessity and inevitability, pushing to the side any notions of the contingent accumulation of its situation and reconfigurability of its aggregate parts. In its peak stage, domination is when the executability of an emergent will—computational, cultural, political, etc.—is able to attain a level of everyday, almost automatic execution. Thus the norm critique of Foucault points towards an understanding of execution and executability as a productive ability by a process to make dominant and sustainable its particular discursive practices and requirements in a world of many other potential processes. And yet all the while there remain the frictions and resistance of the limit, of bodies, time, and space, as well as the "vibrant" (Bennett 2010) qualities of matter itself (a material agency which Foucault could be said to be less attuned to in his thinking). Turing's computing machine, as sketched according to its logical capacities, may have been granted "an infinite memory capacity obtained in the form of an infinite tape..." (Turing 1948, p.3), but execution in the world is rather more bounded and situated.
Wendy Chun's work provides numerous examples of what might be seen as peak stage and dominant processes of computationally-infused practices of the present, such as in her recent characterisations of “habitual new media” and the way in which such media have a notable quality of driving users into a cycle of “updating to remain the same” (Chun 2016). For instance, there is the currently accepted habit of a continually shifting and often powerfully political set of terms of service that are rolled out by many of the currently dominant networked computing platforms to their accepting users. As Chun and others (e.g. Cayley 2013) point out, every such instance of updating and agreeing to terms involves a certain abnegation of decision and responsibility. Still, the computational codes that underwrite more and more of contemporary cultural and political activities of the moment have a particular way of selling this abnegation as the best means with which to guarantee their subjects a sustainable living in the digital era (whether what is being sold is a data mining process for being kept safe from terrorism or a particular social feature to be implemented in the name of a better collective experience).
Again, a pincer-like, discursive binding can be seen to be in play in these impositions, whereby the tools in question are “increasingly privileged. . . because they seem to enforce automatically what they prescribe” (Chun 2011, p.91-2). Chun goes on to point out how such moves can be seen as a promotion of code as ''logos'', “code as source, code as conflated with, and substituting for, action… [producing] (the illusion of) mythical and mystical sovereign subjects who weld together norm with reality, word with action” (Chun 2011, p.92). As can be seen to be enacted in many of the ongoing crisis-oriented forms of governance today, in which a state of emergency of one kind or another is implied as the natural state of things, the application of computational modes of execution into such situations can be viewed as almost a perfection of the potential violence of the ''polis'', a cutting out of the executive so as to become the executor (Chun 2011, p.101). The conditional, protocological command is summed up in a Gilbert & Sullivan lyric: “Defer, Defer, to the Lord High Executioner!”
Such a pointing towards the ways in which automation involves an acceptance and thus a deferral of a moment of decision also points to how the initialising of the decision engine of an executable computer program is both the initiating of a decision making process and also a termination, in that the moment an executable process is run, decision making processes at the level of the executing code in question are thus set. Executing decisions become predefined, as do the parameters of the computable inputs and outputs. This is not to bemoan such a situation, but rather to consider this potent quality of computation as it is put into sites of encounter with the discursive entities that are brought within its range. Programmed habituation has demonstrated a range of both exciting and predictable vectors thus far (with Chun's examples highlighting a wide spectrum of its contours). Unsurprisingly, one such vector is the push on the part of certain dominant manifestations of programmed executability to extend the reach of their particular methods and terms. As with any discursive force, one of programmed execution’s most powerful effects is not only its deferral of responsibility, but the expansion of power that can occur once such a deferral is normalised. The interstice's site of contestation and potential emergence fades with every such deferral to the particular powers in question.

'''sites of execution: invisible hands'''
shift.
In histories of computing, the punched cards and partly automated execution of Joseph Marie Jacquard’s looms and Charles Babbage and Ada Lovelace’s subsequent implementation of a punched card model of programming for Babbage’s Analytical Engine make for popular launching points for discussions on the epistemic and material breakthroughs that led to the general purpose computers of the present.
As writers such as Sadie Plant (1995) and Nicholas Mirzoeff (2016) have highlighted, in the midst of his transition from his work on the Difference Engine to that of the Analytical Engine, Babbage undertook an extended study of the workings and effects of automated machines used in various forms of manufacture. The research resulted in the text ''On the Economy of Machinery and Manufactures'', published in 1832 to relatively popular acclaim at the time. The text features extended explanations of the mechanical operations of such machines in the factories of Babbage's time and their accompanying forms of specialisation, standardisation and systemisation, as reflected both within the factories and also in the economies emerging out of these mechanised factories as a whole. The influence of Adam Smith and emerging thoughts around economic models of efficiency is particularly evident, with Babbage emphasising the benefits that might be achieved with greater division of labour. Babbage (cited in Mirzoeff 2016, p.4): “the most important principle on which the economy of a manufacture depends on is the ''division of labour''.” As witnessed in a passage describing Gaspard de Prony’s work with turning unemployed servants and wig dressers into “computers” capable of calculating trigonometric tables by means of addition and subtraction, the idea can be seen to take hold on Babbage that such factories and strategically partitioned forms of labour might be understood as schematics and material setups for entirely mechanical forms of computation. Babbage would later look back on these early factories as prototypes for his “thinking machines,” with his naming of the “mill” as the central processing unit for his Analytical Engine figuring as the most obvious token of this influence on his combined computational and material thinking. But just as Babbage's model for computing proved prototypical of those that were to come, so too did this partitioning and making invisible of the workforces involved in these developments persist into the present, whether one is speaking of the above-mentioned servants and wig dressers, Ada Lovelace's work with Babbage, or the many other women, minorities and other all-too-often precariously employed workers involved therein. These obscured bodies, their marginalised, “nimble fingers” (Nakamura 2014), continue to remain at best as “footnotes” (Plant 1995, p.63-4) in histories and practices of computing, despite their crucial role in the creation, running and sustaining of the existence of computational economies, whether at the level of manufacturing (Nakamura 2014), programming (Chun 2015; Plant 1995) or online community management and support (Nakamura 2015).
[[File:Scissors.png|240px|thumb|Antoine-Augustin Cournot's 1838 scissor diagram, part of a series of what are thought to be the first recorded uses in economics of geometrically plotted supply and demand curve analysis, image from Thomas Humphrey, "Marshallian Cross Diagrams and Their Uses before Alfred Marshall: The Origins of Supply and Demand Geometry", 1992, p.5]]
While discussions of invisibility and power typically highlight invisibility's ability to cloak the "dirty work" of any particular will towards domination, it is worth remembering that acts of blackboxing and making invisible are often discursively and practically implemented in the name of efficiency. Adam Smith and the rising stars of liberal economic thinking in Babbage's time were notable for their powerful new formulations of economic rationality and self-interest, devising what quickly proved powerfully attractive models for both the operational manufacture of goods and the interpretation of the flow of such goods within the economy. Of note among these developing models was that of geometrically plotted supply and demand curves, relatively simple mechanisms for determining optimal forms of production according to the ideal price points (key sites for execution in a capitalist economy) that these models revealed. Armed with such powerful algorithmic-style decision making methods for rational determinations of maximal economic “efficiency,” liberal economics, as discipline, instrument and ideology, takes off, providing as it does a relatively uncluttered road map for the emerging capitalist forms of production at the time.
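As a rough worked example of the kind of determination such geometrically plotted curves make executable (the linear curves and coefficients below are hypothetical, chosen purely for illustration rather than taken from Cournot or from any of the sources cited), the “ideal price point” of a pair of straight-line supply and demand schedules reduces to a single line of algebra:
<syntaxhighlight lang="python">
# Hypothetical linear curves: demand Qd = a - b*p, supply Qs = c + d*p.
# The "ideal price point" is simply the price at which the two quantities meet.
def equilibrium(a, b, c, d):
    p = (a - c) / (b + d)   # solve a - b*p = c + d*p for p
    q = a - b * p           # quantity demanded (and supplied) at that price
    return p, q

price, quantity = equilibrium(a=100, b=2, c=10, d=1)  # made-up coefficients
print(price, quantity)  # 30.0 40.0
</syntaxhighlight>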
As the example of Babbage indicates, and as examples such as Mirzoeff’s (“Control, Compute, Execute: Debt and New Media, The First Two Centuries”) recent reading of debt’s key role in new media’s origins highlight, economics (in both its rational interpretive mode and material practice) acts as a key and notably concurrent point of reference for computational thinking and practices. Consider, for instance, John von Neumann’s contributions to both fields. Or the economic substrate of venture capital's particular logic of funding that sprouts and orients so many of the silicon valleys(/alleys) in today's digital ecologies (while, again, funding none of the crucial interventions and work of the many "venture labourers"—Gina Neff, cited in Nakamura 2015—who help to support, curate and maintain these very ecologies).
In his book ''An Inquiry into the Nature and Causes of the Wealth of Nations'', Smith (1843, p.184) famously evokes at one (and only one) point an image of the invisible hand: “by directing that industry in such a manner as its produce may be of the greatest value, he intends only his own gain, and he is in this, as in many other cases, led by an invisible hand to promote an end which was no part of his intention.” As Foucault (2008) makes clear in his analysis of the origins of various forms of liberal economic thought (particularly those of so-called ordoliberal and neoliberal forms), this is a blackboxing of economic responsibility and decision making, pronouncing economic causes as both autonomous and unknowable - and thus unregulatable and ungovernable. Furthermore, such an invisible instruction pointer is a handy processing device for paving the way for a kind of automation of the economy in the service of this liberated invisible hand. As in Chun’s examples, it is the blackboxing, the making invisible of the source in the name of a kind of “sorcery” that is key here (this invisible hand being a perhaps more mundane but nevertheless similarly shadowy form of the "dark interpreter" that Howse, 2015, speaks of). This invisibility is a key and rather handy feature (Foucault 2008, p.279-80) for such an executable mode of liberal economics, prescribing as it does a crucial and ideologically conditioned act of deferral in the decision making powers of all the subjects that come under its sweep.
In this notable characterisation of the model as a seemingly driverless system, the model in question is of course able to continue to channel existing and emergent forms of systemic bias, human discrimination, environmental manipulation and other forms of age-old and ongoing feedback loops that were and continue to be at play in the production of such models—only now with a potentially more unregulated, and thus potentially more extensive scope to its executability.
A compelling example of shifting the site of execution, one that specifically queries execution as it transpires across economics, politics and computation, can be found in the artist duo YoHa’s (Graham Harwood and Matsuko Yokokoji) work ''Invisible Airs'' (YoHa 2011, with assistance from Stephen Fortune). This project features a series of “contraptions”[5] aimed at performing the largely opaque contours of Bristol City Council’s expenditures, which at the time had undergone a shift to seeming transparency as a result of the council’s open data initiative (see https://opendata.bristol.gov.uk/ for how far the city’s open data policy has developed since). As they put it in their project proposal to the council, “We will take the database record, and unfold it – reverse engineer it – to understand it as a technology of power” (YoHa 2011).
As with much of YoHa’s work, the project resulted in several outputs (see Alistair Oldham’s helpful short documentary, which gives an overview of the project’s many different parts: https://vimeo.com/36567631). At the centre are four contraptions that shift individual entries of Bristol City Council expenditure from the .CSV files of the “Council budgets and spending Expenditure over £500” open data sets into different manifestations of pneumatic devices that materialise the data in various fashions. These were: the Open Data Book Stabber, the Public Expenditure Riding Machine, the Expenditure Filled Spud Gun and the Older People Pneumatic Brusher.
[[File:YoHa_ODBS.gif|center]]
<center>YoHa, Invisible Airs, "Open Data Book Stabber"</center> | |||
The contraptions were demonstrated to members of the public in various locations around the city. YoHa were also able to give an hour-long presentation of ''Invisible Airs'' held before the Lord Mayor of Bristol in the chamber room of the council. This “Pneumatic Soiree” involved an explanation of the project, descriptions of YoHa’s forensic investigations into the council’s expenditure data and interactions with the council’s various IT officers, and an engaging genealogical excursion through the overlapping histories of databases, pneumatics and technologies of power (full video of the presentation available here: http://yoha.co.uk/Soiree). Each of these interventions aimed to acknowledge ongoing historical and social formations of interstitial gaps between knowledge and power. In this case, coming as it did in the still early days of so-called open data initiatives on the part of government, YoHa were particularly sensitive to the “gap between the wider public's perception of data” and the general “form of indifference toward the expectations of this kind of open data initiative,” with the pieces attempting to create “A partial remedy for this indifference” through “making data more vital” - while also “taking a more critical view of transparency itself” (Harwood 2015, p.93).
[[File:YoHa_bills.gif|center]]
<center>YoHa, Invisible Airs, "Bristol City Council public expenditures"</center> | |||
Inspired in part by the rich local history of the Bristol Pneumatic Institute, YoHa (and assistant Stephen Fortune) devised ways of having pneumatic actuators release jolts of compressed air in relation to the amount of public money being spent by the council. This involved a careful unpacking of the so-called open data as it was structured and arranged via the various computable parameters that work so as to discipline the many potential moveable parts of such database systems. These include the defined data field containers (e.g. “ACCOUNTING SUSPENSE CODE”, “WASTE OPERATIONS”, “OLDER PEOPLE”, “DRUG STRATEGY”, “FLEET ACCIDENT HOLDING ACCOUNT”, “NON-OPERATIONAL PROPERTIES”) that prescribe what the database can and cannot hold. As Harwood (2015, p.91) notes, “constructing a container called ‘street’ excludes living in the sea or in a forest or on the moon or, in the case of Gypsies, by the side of the road.” Each of these containers is arranged in the tabular form of CSV (comma separated value) files in which certain relations and hierarchies are established and from which certain queries can be encoded via scripts that enable the pulling of data from the parent databases. In order to make the relevant data extractable and amenable to the intentions of their project, YoHa were forced to unravel many of the various and at times opaque granularities and relational lineages of the 20,000 comma-separated lines of the database (such as certain differences between the Open Data CSV and the original databases from which it was formed). This involved further discussion with the council’s IT team, repurposing aspects of the datasets so that they could be integrated with YoHa’s own Perl scripts for the translating of expenditures into pneumatic values for eventual release in the contraptions themselves. In each such transformation various forms of agency and interstitial energies emerge, with databases being particularly powerful “transducers of knowledge and power rapidly moving through us, separating us, reforming us, folding us up into their parts. . . allowing new forms of power to emerge from the machine’s ability to push and process large sets of information into the gaps between knowledge and power. As databases order, compare and sort they create new views of the information they contain. New perspectives amplify, speed-up and restructure particular forms of power as they supersede others” (YoHa 2011).
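The general shape of this translation can be sketched as follows. This is a hypothetical approximation in Python rather than YoHa's actual Perl; the filename, column names and scaling bounds are assumptions made for illustration, not details taken from the project or from the published dataset:
<syntaxhighlight lang="python">
# Hypothetical sketch of the basic translation: read comma-separated expenditure
# lines and scale each amount into a pneumatic value, here imagined as a
# valve-opening duration in milliseconds for the actuator.
import csv

def expenditure_to_pulse(amount, lo=500.0, hi=100000.0, min_ms=50, max_ms=2000):
    """Map a spend amount (GBP) onto a bounded pulse length."""
    amount = max(lo, min(amount, hi))              # clamp to the expected range
    scale = (amount - lo) / (hi - lo)
    return int(min_ms + scale * (max_ms - min_ms))

with open("expenditure_over_500.csv", newline="") as f:   # assumed filename
    for row in csv.DictReader(f):
        amount = float(row["Amount"])                      # assumed column name
        print(row.get("Service Area", "?"), expenditure_to_pulse(amount))
</syntaxhighlight>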
[[File:YoHa_PERM2.gif|center]]
<center>YoHa, Invisible Airs, "Public Expenditure Riding Machine"</center> | <center>YoHa, Invisible Airs, "Public Expenditure Riding Machine"</center> | ||
After carrying out several workshops in which the contraptions were shown to the public, YoHa felt that while the open data book stabber had a certain arresting visual quality to it, the most engaging of the contraptions from the point of view of the project was the Public Expenditure Riding Machine: "people really like the book stabbing, but it's kind of conceptual. . . because people don't physically experience it, it falls away quite quickly. So we really learnt about the physicality, it's actually when people's bodies are moved by the data that then they really engage with it" (YoHa, in Oldham 2012). As with Howse’s work, the site of the body proves a particularly productive one for shifting onto and manifesting the computational. It seems an especially effective method in an age in which ubiquitous computing exponentially multiplies sites of execution into what can be seen as an attempt to create so many virtualised pricks whose blackboxed, often seemingly banal nature has a way of bringing computation and its entwining with forms of power below the threshold of distinguishable experience. In transductions such as Howse’s and YoHa’s, a particular shifting of the computational occurs via the demarcating of a site in which a readily visible contraption forcefully compresses aspects of computation onto the body, making sensible certain contours of execution and its power to change things through this very act of shifting the instruction pointer. In YoHa's case, it is not a transduction whose main purpose is aesthetic experimentation (as in a vast amount of media art to date), but rather one with the explicit goal of registering certain “pains” and the formative, if often invisible, expressions of power that these computing processes manifest in their every execution.
'''sites of execution: interface'''
//In the eventual dissertation version of this writing, the idea currently is for this series of shifting sites of execution to act as a kind of prelude that opens up to a more extended analysis of the implementation of the clickable like button in Facebook, highlighting the history of its implementation and expanding functionality, and relating it to the earlier examples of sites of execution (e.g. like button as a simple flippable switch that acts as a counter for social demand and algorithmic supply of Facebook content, an interstice, a visible "hand" that gathers many underlying bits of code and economies around it, etc.).
[[File:Like.png|center]]
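As a placeholder for that analysis, the “simple flippable switch” reading of the like button can be sketched as a toy data structure (purely illustrative, and making no claim about Facebook's actual implementation): each toggle flips a per-user on/off state and adjusts an aggregate count that downstream ranking processes could treat as a signal of demand.
<syntaxhighlight lang="python">
# Toy sketch only; not a description of Facebook's actual code.
class LikeButton:
    def __init__(self):
        self.liked_by = set()   # which users currently have the switch flipped on
        self.count = 0          # the running counter exposed to other processes

    def toggle(self, user_id):
        if user_id in self.liked_by:
            self.liked_by.remove(user_id)
            self.count -= 1
        else:
            self.liked_by.add(user_id)
            self.count += 1
        return self.count

button = LikeButton()
button.toggle("alice")   # -> 1
button.toggle("bob")     # -> 2
button.toggle("alice")   # -> 1 (the switch flips back off)
</syntaxhighlight>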
[4] Howse (2015): "But what is exactly this hidden place of the now, where symbolic orders, where language becomes material change at a quantum level? Where words are subjected to literal and not literary un-angelled noise. . . This non-place is the CPU or Central Processing Unit, anchoring any technology, AKA. the Dark Interpreter".
[[File:YoHa_contraption.png|300px|right|thumbnail|slide from YoHa's "Pneumatic Soiree" presentation]]
[5] A term used by YoHa to signal "a domain where the technical overlaps with the imaginary." A contraption in YoHa's adoption of the term is also any device notable for its highlighting of "the unstable state before invention becomes normalised," one whose "inherently unstable refusal of utility [emphasises] the forces in the machine that could break it" (Harwood 2015, p.73). Finally, through their very non-discursive, irrational qualities, contraptions also have a way of materially shifting the needle of interpretation, highlighting the violence of discourse itself via their own "unsafe" methods. Thus, "A pneumatic open-data book stabber is a contraption while a nuclear submarine is reasonable" (YoHa 2011), the contraption casting an oblique light on the discursive machines it plays off of, creating "an unsafe space in which the non-discursive can mix freely with unhinged imaginings. It is a place where limitations of knowledge and discipline are curiously redundant and become a basic method of enquiry" (Harwood 2015, p.73-4).
'''references'''
Howse, Martin. “Dark Interpreter – Provide by Arts for the hardnesse of Nature.” ''Occulto Magazine'', Issue δ, 2015.
Kittler, Friedrich. "There is No Software. | Kittler, Friedrich. "There is No Software" (originally published in 1992). In ''Literature, Media, Information Systems''. Ed. john Johnston. Overseas Publishers Association: Amsterdam, 1997. | ||
Kittler, Friedrich. ''Optical Media: Berlin Lectures 1999''. Tr. Anthony Enns. Cambridge, England: Polity Press, 2010.
Nakamura, Lisa. “The Unwanted Labour of Social Media: Women of Color Call Out Culture as Venture Community Management.” ''New Formations: a journal of culture, theory, politics'', 106-112, 2015. https://lnakamur.files.wordpress.com/2011/01/unwanted-labor-of-social-media-nakamura1.pdf
Oldham, Alistair. ''Invisible Airs''. Acacia Films, 2012. Film. https://vimeo.com/36567631
Pasquinelli, Matteo. "What an Apparatus is Not: On the Archeology of the Norm in Foucault, Canguilhem, and Goldstein." ''Parrhesia'', n. 22, May 2015, pp. 79-89. Web. http://www.parrhesiajournal.org/parrhesia22/parrhesia22_pasquinelli.pdf
Smith, Adam. ''An Inquiry into the Nature and Causes of the Wealth of Nations''. Edinburgh: Thomas Nelson, 1843 [originally published 1776].
Turing, Alan M. “On Computable Numbers, with an Application to the ''Entscheidungsproblem''.” ''Proceedings of the London Mathematical Society'', Second Series, V. 42, 1937, p. 249. Print.
Turing, Alan M. "Intelligent Machinery (manuscript)". The Turing Archive. 1948. | Turing, Alan M. "Intelligent Machinery (manuscript)". The Turing Archive. 1948. | ||
Whitelaw, Mitchell. “Sheer Hardware: Material Computing in the Work of Martin Howse and Ralf Baecker.” ''Scan: Journal of Media Arts Culture'', Vol.10 No.2, 2013. Web. http://scan.net.au/scn/journal/vol10number2/Mitchell-Whitelaw.html
YoHa. ''Invisible Airs: Database, Expenditure and Power''. ''YoHa'', 2011. Web. http://yoha.co.uk/invisible
Latest revision as of 19:45, 17 May 2016
//a direct continuation, in draft form, of earlier text for *.exe (ver0.1) for discussion at *.exe (ver0.2) meetup. also: S.T.A.B.B.Y.
sites of execution: interstice
- “the empty container implies the existence of the object that it is supposed to possess”
- —Graham Harwood ("Pneumatic Soiree" presentation, part of YoHa's Invisible Airs, 2011)
Execution in the world is a knotty, complex thing, and there are further ways that the notion of a site of execution might be productively considered. Foucault’s employment of the concept of “interstice” is one such helpful resource. The term appears in Foucault’s outline of what constitutes a genealogical approach, “Nietzsche, Genealogy, History.” For Foucault (at this juncture in his thinking), genealogy should treat the history of any emergent interpretations as a continued interplay and contestation of processes of "descent," "emergence" and "domination," as they alight and intersect from one act of interpretation and resultant manifestation to another. In speaking of emergence, Foucault describes how any study into the emergence of discursive forces should not be that of a speculation on origins, but rather a study of the “interstices” of existing, often confrontational, forces and how these interstices can be seen to provide traction or take on certain generative qualities. Interstitials of emergence are themselves formed from both existing energies of "descent" (any enduring forces that can be seen to fed into sites of emergence) and "domination" (the ability of any existing or emergent discursive force to harden into stabilised, recognisable and thus enforceable forms of expression). Foucault (1984, p.84-5), drawing directly on Friedrich Nietzsche’s genealogical pursuits[3], describes a site of emergence as a “scene” where competing forces and energies “are displayed superimposed or face-to-face.” The interstice in this instance is a site of confrontation of such forces, it is “nothing but the space that divides them, the void through which they exchange their threatening gestures and speeches. . . it is a ‘non-place,’ a pure distance, which indicates that the adversaries do not belong to a common space.” In this formulation then, an interstice is a scene, a superimposition and a confrontational void through which compelling but, in one sense or another, competing forms of potential gestures and energies generate certain expressive powers of exchange and action. The emphasis in this conceptual abstraction is not so much on the materiality of an actual site, but it nevertheless points to a sense of a key point of interoperating lines of force, for which the term interstice provides the overarching rubric.
Of particular interest in contrasting this notion of interstice with that of the direct shifting and launching of computational processes into the skin of a user or the soil of the earth in the examples of Howse, is the way in which Foucault’s characterisation, while containing a similar notion of confrontation and interaction, nevertheless is characterised as a superimposition involving a “pure distance” (as will be returned to later, Howse himself, without any hint of Foucault, nevertheless ends up using the same term of "non-place" as a descriptor of the CPU[4]). In Foucault’s site of the interstice then, the emphasis is shifted from direct contact to this distance between superimposed forces and the way in which the resultant demarcated gaps in between these forces can create compelling drives and potentially productive, generative forms of interplay. To couch it in the terms of this essay, the focus in any such encounters in a site of execution then becomes just as much on the uncomputable as it is on the computable, with the interstice as the emergent site and breeding ground of a will to interrogate or close this very gap.
As a simple example, consider the originary insterstitial gap that Turing’s (1937) machine model opens up, with its mandate of discrete, symbolic elements capable of being enumerated and made into effectively calculable algorithms for execution upon and by machines. In the further materialization of Turing’s thesis into actual computing machines, the act of making things discrete, so as to be computable, becomes one of establishing machine-readable cuts: the switchable on and off state elements, or flip-flops executed via logic gates used to store and control data flow. Such flippable states constitute a material basis that allows for the writing and running of the executable binary instructions of machine code upon a computing machine, with the standard process for preparing and executing so-called "source" code involving a translation into machine executable code being via a compiler, interpreter or assembler (or some amalgamation of properties of each of these approaches). Although any manifestation of a Turing machine necessarily involves the storage and transcription of such discrete symbols onto a continuous flux of what (in contrasting relation) are described as “analogue” materials, this marked independence of symbol from substrate (an independence strong enough for Turing to anoint it as “universal”), their outright “indifference” (Schrimshaw 2012; Whitelaw 2013) and yet simultaneous reliance upon each other for functioning purposes, is a kind of “pure”, interstitial distance that Foucault’s formulation can be seen to point at. A key emergent “medial appetite” instigated here being the ongoing way in which various extensive and intensive qualities of the digital and analogue (the discrete and the continuous) are always in some respect “out of kilter” (Fuller 2005, p.82) with one another. Out of such interstitials, the emergence of things such as the seemingly pathological drive to have the digital replicate the analogue and overcome the “lossy” gaps therein, the constant pushing towards faster speeds, higher resolutions, “cleaner” signals, “smarter” machines and so on. Such interstitial, protocological sparks are often readily evident in even just the collision course nature of the names we give to these endeavours: “Internet” of “Things”; “Artificial” “Intelligence”; "Virtual" "Reality". A particularly productive agential cut then, this incision of the digital and its seismic materialisation in computational form, giving birth as it has to the “manic cutter known as the computer" (Kittler 2010, p.228).
Such a method of divide and conquer can be understood as essential to most discursive systems, the method in question working to make entities executable according to their particular logics and delimited needs. In relation to execution and its capacities of automation, the productive insterstitial gap involved instigates a potential drive towards making more things executable according to the logic of the executing process in question, of enclosing more entities and procedures into its discursive powers. This is a will towards what Foucault gives the name of domination to, the way in which sites of emergence can be understood to have the ability to sediment into dominant forms of understanding and expression. Domination is the consolidating of an emergent will, a species-like hardening of a system of interpretation and a bending of other wills and forces to its own compositional orientations. An engraving of power into the bodies it would make accountable to it. Such a self-amplifying will to power often gains further traction by convincing all parties involved of its own normative necessity and inevitability, pushing to the side any notions of the contingent accumulation of its situation and reconfigurability of its aggregate parts. In its peak stage, domination is when the executability of an emergent will—computational, cultural, political, etc.—is able to attain a level of everyday, almost automatic execution. Thus the norm critique of Foucault points towards an understanding of execution and executability as a productive ability by a process to make dominant and sustainable its particular discursive practices and requirements in a world of many other potential processes. And yet all the while the frictions and resistance of the limit, of bodies, time, and space, as well as the "vibrant" (Bennett 2010) qualities of matter itself (a material agency which Foucault could be said to be less attuned to in his thinking). Turing's computing machine, as sketched according to its logical capacities, may have been granted "an infinite memory capacity obtained in the form of an infinite tape..." (Turing 1944, p.3), but execution in the world is rather more bounded and situated.
Wendy Chun's work provides numerous examples of what might be seen as peak stage and dominant processes of computationally-infused practices of the present, such as in her recent characterisations of “habitual new media” and the way in which such media have a notable quality of driving users into a cycle of “updating to remain the same” (Chun 2016). For instance, the currently accepted habit of a continually shifting and often powerfully political set of terms of services that are rolled out by many of the currently dominant networked computing platforms to their accepting users. As Chun and others (e.g. Cayley 2013) point out, every such instance of updating and agreeing to terms involves a certain abnegation of decision and responsibility. Still, the computational codes that underwrite more and more of contemporary cultural and political activities of the moment have a particular way of selling this abnegation as the best means with which to guarantee its subjects a sustainable living in the digital era (whether what is being sold is a data mining process for being kept safe from terrorism or a particular social feature to be implemented in the name of a better collective experience). Again, a pincer-like, discursive binding can be seen to be in play in these impositions, whereby the tools in question are “increasingly privileged. . . because they seem to enforce automatically what they prescribe” (Chun 2011, p.91-2). Chun goes on to point out how such moves can be seen as a promotion of code as logos, “code as source, code as conflated with, and substituting for, action… [producing] (the illusion of) mythical and mystical sovereign subjects who weld together norm with reality, word with action.” (Chun 2011, p.92). As can be seen to be enacted in many of the ongoing crisis-oriented forms of governance today, in which a state of emergency of one kind or another is implied as the natural state of things, the application of computational modes of execution into such situations can be viewed as almost a perfection of the potential violence of the polis, a cutting out of the executive so as to become the executor (Chun 2011, p.101). The conditional, protological command summed up in a Gilbert & Sullivan lyric: “Defer, Defer, to the Lord High Executioner!”
Such a pointing towards the ways in which automation involves an acceptance and thus deferral of a moment of decision, also points to how the initialising of the decision engine of an executable computer program is both the initiating of a decision making process but also a termination, in that the moment an executable process is run decision making processes at the level of the executing code in question are thus set. Executing decisions become predefined, as do the parameters of the computable inputs and outputs. This is not to bemoan such a situation, but rather to consider this potent quality of computation as it is put into sites of encounter with the discursive entities that are brought within its range. Programmed habituation has demonstrated a range of both exciting and predictable vectors thus far (with Chun's examples highlighting a wide spectrum of its contours). Unsurprisingly, one such vector is the push on the part of certain dominant manifestations of programmed executability to extend the reach of their particular methods and terms. As with any discursive force, one of programmed execution’s most powerful effects is not only its deferral of responsibility, but the expansion of power that can occur once such a deferral is normalised. The interstice's site of contestation and potential emergence fading with every such deferral to the particular powers in question.
sites of execution: invisible hands
shift.
In histories of computing, the punched cards and partly automated execution of Joseph Maria Jacquard’s looms and Charles Babbage and Ada Lovelace’s subsequent implementation of a punched card model of programming for Babbage’s Analytical Engine make for popular launching points for discussions on the epistemic and material breakthroughs that led to the general purpose computers of the present. As writers such as Sadie Plant (1995) and Nicholas Mirzoeff (2016) have highlighted, in the midst of his transition from his work on the Difference Engine to that of the Analytical Engine, Baggage undertook an extended study of the workings and effects of automated machines used in various forms of manufacture. The research resulted in the text On the Economy of Machinery and Manufactures, published in 1832 to relatively popular acclaim at the time. The text features extended explanations of the mechanical operations of such machines in the factories of Babbage's time and their accompanying forms of specialization, standardisation and systemisation, as reflected both within the factories and also in the economies emerging out of these mechanised factories as a whole. The influence of Adam Smith and emerging thoughts around economic models of efficiency is particularly evident, with Babbage emphasising the benefits that might be achieved with greater division of labour. Babbage (cited in Mirzoeff p.4): “the most important principle on which the economy of a manufacture depends on is the division of labour.” As witnessed in a passage describing Gaspard de Prony’s work with turning unemployed servants and wig dressers into “computers” capable of calculating trigonometric tables by means of addition and subtraction, the idea can be seen to take hold on Babbage that such factories and strategically partitioned forms of labour might be understood as schematics and material setups for entirely mechanical forms of computation. Babbage would later look back on these early factories as prototypes for his “thinking machines,” with his naming of the “mill” as the central processing unit for his Analytical Engine figuring as the most obvious token of this influence on his combined computational and material thinking. But just as Babbage's model for computing proved prototypical of those that were to come, so too did this partitioning and making invisible of the workforces involved in these developments persist into the present, whether one is speaking of the above-mentioned servants and wig dressers, Ada Lovelace's work with Babbage, or the many other women, minorities and other all-too-often precariously employed workers involved therein. These obscured bodies, their marginalised, “nimble fingers” (Nakamura 2014), continue to remain at best as “footnotes” (Plant 1995, p.63-4) in histories and practices of computing, despite their crucial role in the creation, running and sustaining of the existence of computational economies, whether at the level of manufacturing (Nakamura 2014), programming (Chun 2015; Plant 1995) or online community management and support (Nakamura 2015).
While discussions of invisibility and power typically highlight the capacity of invisibility to cloak the "dirty work" of any particular will towards domination, it is worth remembering that acts of blackboxing and making invisible are often discursively and practically implemented in the name of efficiency. Adam Smith and the rising stars of liberal economic thought in Babbage's time were notable for their powerful new formulations of economic rationality and self-interest, devising what quickly proved to be powerfully attractive models for both the operational manufacture of goods and the interpretation of the flow of such goods within the economy. Of note among these developing models was that of geometrically plotted supply and demand curves, relatively simple mechanisms for determining optimal forms of production according to the ideal price points (key sites of execution in a capitalist economy) that such models revealed. Armed with such algorithmic-style decision making methods for rational determinations of maximal economic “efficiency,” liberal economics, as discipline, instrument and ideology, takes off, providing as it does a relatively uncluttered road map for the emerging capitalist forms of production of the time. As the example of Babbage indicates, and as examples such as Mirzoeff’s (“Control, Compute, Execute: Debt and New Media, The First Two Centuries”) recent reading of debt’s key role in new media’s origins highlight, economics (in both its rational interpretive mode and its material practice) acts as a key and notably concurrent point of reference for computational thinking and practices. Consider, for instance, John von Neumann’s contributions to both fields, or the economic substrate of venture capital's particular logic of funding that sprouts and orients so many of the silicon valleys(/alleys) in today's digital ecologies (while, again, funding none of the crucial interventions and work of the many "venture labourers", as Gina Neff, cited in Nakamura 2015, terms them, who help to support, curate and maintain these very ecologies).
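To give a sense of the algorithmic simplicity of such price-point reasoning, the following is a minimal sketch of the textbook case: with linear demand and supply schedules, the “ideal” price is simply the value at which the two intersect. The coefficient names and figures here are invented purely for illustration and are not drawn from Smith, Babbage or Mirzoeff:

<pre>
# Minimal illustrative sketch: an "ideal" price point from linear supply and demand.
# All names and numbers are hypothetical, chosen only to show how little is needed
# to turn the geometric model into an executable decision procedure.

def equilibrium(a, b, c, d):
    """Solve a - b*p = c + d*p for the price p at which demand meets supply."""
    price = (a - c) / (b + d)
    quantity = a - b * price
    return price, quantity

# e.g. demand Qd = 100 - 2p and supply Qs = 10 + p give p = 30, q = 40
print(equilibrium(100, 2, 10, 1))
</pre>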
In his book ''An Inquiry into the Nature and Causes of the Wealth of Nations'', Smith (1843, p.184) famously evokes at one (and only one) point the image of an invisible hand: “by directing that industry in such a manner as its produce may be of the greatest value, he intends only his own gain, and he is in this, as in many other cases, led by an invisible hand to promote an end which was no part of his intention.” As Foucault (2008) makes clear in his analysis of the origins of various forms of liberal economic thought (particularly its so-called ordoliberal and neoliberal variants), this is a blackboxing of economic responsibility and decision making, pronouncing economic causes as both autonomous and unknowable - and thus unregulatable and ungovernable. Furthermore, such an invisible instruction pointer proves a handy processing device, paving the way for a kind of automation of the economy in the service of this liberated invisible hand. As in Chun’s examples, it is the blackboxing, the making invisible of the source in the name of a kind of “sorcery,” that is key here (this invisible hand as a perhaps more mundane but nevertheless similarly shadowy form of "dark interpreter" of the kind Howse, 2015, speaks of). This invisibility is a key and rather handy feature (Foucault 2008, p.279-80) of such an executable mode of liberal economics, prescribing as it does a crucial and ideologically conditioned act of deferral in the decision making powers of all the subjects that come under its sweep. In this characterisation of a seemingly driverless system, the model in question is of course able to continue to channel existing and emergent forms of systemic bias, human discrimination, environmental manipulation and other age-old and ongoing feedback loops that were and continue to be at play in the production of such models - only now with a potentially more unregulated, and thus more extensive, scope to its executability.
A compelling example of shifting the site of execution, one that specifically queries execution as it transpires across economics, politics and computation, can be found in the artist duo YoHa’s (Graham Harwood and Matsuko Yokokoji) work ''Invisible Airs'' (YoHa 2011, with assistance from Stephen Fortune). The project features a series of “contraptions”[5] aimed at performing the largely opaque contours of Bristol City Council’s expenditures, which at the time had undergone a shift to seeming transparency as a result of the council’s open data initiative (see https://opendata.bristol.gov.uk/ for how far the city’s open data policy has developed since). As they put it in their project proposal to the council, “We will take the database record, and unfold it – reverse engineer it – to understand it as a technology of power” (YoHa 2011).
As with much of YoHa’s work, the project resulted in several outputs (see Alistair Oldham’s helpful short documentary, which gives an overview of the project’s many different parts: https://vimeo.com/36567631). At its centre are four contraptions that shift individual entries of Bristol City Council expenditure from the .CSV files of the “Council budgets and spending Expenditure over £500” open data sets into different manifestations of pneumatic devices that materialise the data in various fashions. These were: the Open Data Book Stabber, the Public Expenditure Riding Machine, the Expenditure Filled Spud Gun and the Older People Pneumatic Brusher.
The contraptions were demonstrated to members of the public in various locations around the city. YoHa were also able to give an hour-long presentation of ''Invisible Airs'' before the Lord Mayor of Bristol in the chamber room of the council. This “Pneumatic Soiree” involved an explanation of the project, descriptions of YoHa’s forensic investigations into the council’s expenditure data and their interactions with the council’s various IT officers, and an engaging genealogical excursion through the overlapping histories of databases, pneumatics and technologies of power (full video of the presentation available here: http://yoha.co.uk/Soiree). Each of these interventions aimed to acknowledge ongoing historical and social formations of interstitial gaps between knowledge and power. In this case, coming as it did in the still early days of so-called open data initiatives on the part of government, YoHa were particularly sensitive to the “gap between the wider public's perception of data” and the general “form of indifference toward the expectations of this kind of open data initiative,” with the pieces attempting to create “A partial remedy for this indifference” through “making data more vital” - while also “taking a more critical view of transparency itself” (Harwood 2015, p.93).
Inspired in part by the rich local history of the Bristol Pneumatic Institute, YoHa (with assistant Stephen Fortune) devised ways of having pneumatic actuators release jolts of compressed air in relation to the amount of public money being spent by the council. This involved a careful unpacking of the so-called open data as it was structured and arranged via the various computable parameters that work to discipline the many potential moveable parts of such database systems. These include the defined data field containers (e.g. “ACCOUNTING SUSPENSE CODE”, “WASTE OPERATIONS”, “OLDER PEOPLE”, “DRUG STRATEGY”, “FLEET ACCIDENT HOLDING ACCOUNT”, “NON-OPERATIONAL PROPERTIES”) that prescribe what the database can and cannot hold. As Harwood (2015, p.91) notes, “constructing a container called ‘street’ excludes living in the sea or in a forest or on the moon or, in the case of Gypsies, by the side of the road.” Each of these containers is arranged in the tabular form of CSV (comma separated value) files, in which certain relations and hierarchies are established and from which certain queries can be encoded via scripts that enable the pulling of data from the parent databases. In order to make the relevant data extractable and amenable to the intentions of their project, YoHa were forced to unravel many of the various and at times opaque granularities and relational lineages of the 20,000 comma-separated lines of the database (such as certain differences between the Open Data CSV and the original databases from which it was formed). This involved further discussion with the council’s IT team and a repurposing of aspects of the datasets so that they could be integrated with YoHa’s own Perl scripts for translating expenditures into pneumatic values for eventual release in the contraptions themselves. In each such transformation various forms of agency and interstitial energies emerge, with databases being particularly powerful “transducers of knowledge and power rapidly moving through us, separating us, reforming us, folding us up into their parts. . . allowing new forms of power to emerge from the machine’s ability to push and process large sets of information into the gaps between knowledge and power. As databases order, compare and sort they create new views of the information they contain. New perspectives amplify, speed-up and restructure particular forms of power as they supersede others” (YoHa 2011).
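By way of illustration of this final step of translation, the following sketch shows how a single expenditure line might be read out of such a CSV file and scaled into a pressure value for an actuator. It is an assumption-laden analogue only: YoHa’s own scripts were written in Perl against the council’s actual data layout, while the column names, spending ceiling and pressure range used here are hypothetical stand-ins.

<pre>
# Illustrative sketch only (not YoHa's Perl implementation): reading expenditure
# lines from a hypothetical CSV and mapping each amount onto a pneumatic value.
import csv

MAX_PRESSURE_BAR = 6.0      # assumed upper limit of the pneumatic actuator
MAX_EXPENDITURE = 500_000   # assumed ceiling used to normalise spending amounts

def expenditure_to_pressure(amount):
    """Map a spending amount (GBP) to a pressure value, clamped to the actuator's range."""
    return min(amount / MAX_EXPENDITURE, 1.0) * MAX_PRESSURE_BAR

with open("expenditure_over_500.csv", newline="") as f:   # hypothetical filename
    for row in csv.DictReader(f):
        amount = float(row["Amount"])        # hypothetical column name
        label = row["Service Area"]          # e.g. "OLDER PEOPLE", "WASTE OPERATIONS"
        print(f"{label}: £{amount:.2f} -> {expenditure_to_pressure(amount):.2f} bar")
</pre>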
After carrying out several workshops in which the contraptions were shown to the public, YoHa felt that while the Open Data Book Stabber had a certain arresting visual quality to it, the most engaging of the contraptions from the point of view of the project was the Public Expenditure Riding Machine: "people really like the book stabbing, but it's kind of conceptual. . . because people don't physically experience it, it falls away quite quickly. So we really learnt about the physicality, it's actually when people's bodies are moved by the data that then they really engage with it" (YoHa, in Oldham 2012). As with Howse’s work, the site of the body proves a particularly productive one for shifting onto and manifesting the computational. It seems an especially effective method in an age in which ubiquitous computing exponentially multiplies sites of execution into what can be seen as an attempt to create so many virtualised pricks, whose blackboxed, often seemingly banal nature has a way of bringing computation and its entwining with forms of power below the threshold of distinguishable experience. In transductions such as Howse’s and YoHa’s, a particular shifting of the computational occurs via the demarcating of a site in which a readily visible contraption forcefully compresses aspects of computation onto the body, making sensible certain contours of execution and its power to change things through this very act of shifting the instruction pointer. In YoHa's case, it is not a transduction whose main purpose is aesthetic experimentation (as with a vast amount of media art to date), but rather one with the explicit goal of registering certain “pains” and the formative, if often invisible, expressions of power that these computing processes manifest in their every execution.
'''sites of execution: interface'''
//In the eventual dissertation version of this writing, the idea currently is for this series of shifting sites of execution to act as a kind of prelude that opens up to a more extended analysis of the implementation of the clickable like button in Facebook, highlighting the history of its implementation and expanding functionality, and relating it to the earlier examples of sites of execution (e.g. like button as a simple flippable switch that acts as a counter for social demand and algorithmic supply of Facebook content, an interstice, a visible "hand" that gathers many underlying bits of code and economies around it, etc.).
'''notes'''
[3] Foucault makes reference in this same passage to Nietzsche’s term Entstehungsherd, which it is worth noting is typically translated as “site of emergence”. Entstehungsherd has also been translated as “breeding ground”, hinting both towards the medical sense of Herd as the “seat” or “focus” of a disease and also to its further designation as a site of any biological activity that makes something emerge (Emden 2014, p.138). This is part of a general transition in Nietzsche’s thinking, one that shows up most clearly in his ''Genealogy of Morals'', whereby he aims to displace notions around ''Ursprung'' (origin) with that of Entstehungsherd. Foucault’s description of the interstice can itself be further compared with his well known formulation of the apparatus (dispositif) and its “grids of intelligibility,” in which systemic connections between heterogeneous ensembles of elements are made productive. See Matteo Pasquinelli’s (2015) helpful commentary, “What an Apparatus is Not: On the Archeology of the Norm in Foucault, Canguilhem, and Goldstein,” for a genealogy of Foucault’s employment of dispositif, one that, like von Uexküll’s Umwelt, sees a tracing back of continental philosophy to an earlier breeding ground of biophilosophical ruminations.
[4] Howse (2015): "But what is exactly this hidden place of the now, where symbolic orders, where language becomes material change at a quantum level? Where words are subjected to literal and not literary un-angelled noise. . . This non-place is the CPU or Central Processing Unit, anchoring any technology, AKA. the Dark Interpreter".
[5] A term used by YoHa to signal "a domain where the technical overlaps with the imaginary." A contraption in YoHa's adoption of the term is also any device notable for its highlighting of "the unstable state before invention becomes normalised," one whose "inherently unstable refusal of utility [emphasises] the forces in the machine that could break it" (Harwood 2015, p.73). Finally, through their very non-discursive, irrational qualities, contraptions also have a way of materially shifting the needle of interpretation, highlighting the violence of discourse itself via their own "unsafe" methods. Thus, "A pneumatic open-data book stabber is a contraption while a nuclear submarine is reasonable" (YoHa 2011), the contraption casting an oblique light on the discursive machines it plays off of, creating "an unsafe space in which the non-discursive can mix freely with unhinged imaginings. It is a place where limitations of knowledge and discipline are curiously redundant and become a basic method of enquiry" (Harwood 2015, p.73-4).
'''references'''
Bennett, Jane. Vibrant Matter: A Political Ecology of Things. Durham: Duke University Press, 2010. Print.
Cayley, John. “Terms of Reference & Vectoralist Transgressions: Situating Certain Literary Transactions over Networked Services.” Amodern 2: Network Archaeology. 2013. Web. http://amodern.net/article/terms-of-reference-vectoralist-transgressions/
Chun, Wendy Hui Kyong. "On Software, or the Persistence of Visual Knowledge." Grey Room, No. 18, Winter 2005: 26–51. Print.
Chun, Wendy Hui Kyong. “Crisis, Crisis, Crisis, or Sovereignty and Networks”. Theory, Culture & Society, Vol. 28(6):91-112, Singapore: Sage, 2011. Print.
Chun, Wendy Hui Kyong. Updating to Remain the Same: Habitual New Media. Cambridge, Massachusetts: The MIT Press, 2016. Print.
Emden, Christian J. Nietzsche's Naturalism: Philosophy and the Life Sciences in the Nineteenth Century. Cambridge University Press, 2014. Print.
Foucault, Michel. “Nietzsche, Genealogy, History” (originally published in 1971). In The Foucault Reader. Ed. Paul Rabinow. New York: Pantheon, 1984. Print.
Foucault, Michel. The Birth of Biopolitics: Lectures at the Collège de France 1978–1979. Trans. Graham Burchell. New York: Palgrave Macmillan, 2008.
Fuller, Matthew. Media Ecologies: Materialist Energies in Art and Technoculture. Cambridge, Massachusetts: The MIT Press, 2005. Print.
Harwood, Graham. Database Machinery as Cultural Object, Art as Enquiry. Doctoral dissertation. Sunderland: University of Sunderland, 2015. Print.
Howse, Martin. “Dark Interpreter – Provide by Arts for the hardnesse of Nature.” Occulto Magazine, Issue δ, 2015.
Kittler, Friedrich. "There is No Software" (originally published in 1992). In Literature, Media, Information Systems. Ed. john Johnston. Overseas Publishers Association: Amsterdam, 1997.
Kittler, Friedrich. Optical Media: Berlin Lectures 1999. Trans. Anthony Enns. Cambridge, England: Polity Press, 2010.
Mirzoeff, Nicholas. “Control, Compute, Execute: Debt and New Media, The First Two Centuries.” After Occupy. Web. http://www.nicholasmirzoeff.com/2014/wp-content/uploads/2015/11/Control-Compute-Execute_Mirzoeff.pdf
Nakamura, Lisa. “Indigenous Circuits: Navajo Women and the Racialization of Early Electronics Manufacture.” American Quarterly, 66:4, December 2014, 919-941. https://lnakamur.files.wordpress.com/2011/01/indigenous-circuits-nakamura-aq.pdf
Nakamura, Lisa. “The Unwanted Labour of Social Media: Women of Color Call Out Culture as Venture Community Management.” New Formations: a journal of culture, theory, politics, 106-112, 2015. https://lnakamur.files.wordpress.com/2011/01/unwanted-labor-of-social-media-nakamura1.pdf
Oldham, Alistair. Invisible Airs. Acacia Films, 2012. Film. https://vimeo.com/36567631
Pasquinelli, Matteo. "What an Apparatus is Not: On the Archeology of the Norm in Foucault, Canguilhem, and Goldstein." Parrhesia, n. 22, May 2015, pp. 79-89. Web. http://www.parrhesiajournal.org/parrhesia22/parrhesia22_pasquinelli.pdf
Plant, Sadie. “The Future Looms: Weaving Women and Cybernetics.” In Mike Featherstone and Roger Burrows (eds.), Cyberspace/Cyberbodies/Cyberpunk: Cultures of Technological Embodiment. London: Sage Publications, 1995.
Schrimshaw, William Christopher. “Undermining Media.” artnodes, No.12: Materiality, 2012. Web. http://artnodes.uoc.edu/index.php/artnodes/article/view/n12-schrimshaw
Smith, Adam. An Inquiry into the Nature and Causes of the Wealth of Nations. Edinburgh: Thomas Nelson, 1843 [originally published 1776].
Turing, Alan M. “On Computable Numbers, with an Application to the Entscheidungsproblem.” Proceedings of the London Mathematical Society, Second Series, Vol. 42, 1937, p. 249. Print.
Turing, Alan M. "Intelligent Machinery (manuscript)". The Turing Archive. 1948.
Whitelaw, Mitchell. “Sheer Hardware: Material Computing in the Work of Martin Howse and Ralf Baecker.” Scan: Journal of Media Arts Culture, Vol.10 No.2, 2013. Web. http://scan.net.au/scn/journal/vol10number2/Mitchell-Whitelaw.html
YoHa. Invisible Airs: Database, Expenditure and Power. YoHa, 2011. Web. http://yoha.co.uk/invisible