Programming language: algorithmic notation / pseudocode.
Name of author/s: Heinz von Foerster.
Year circulated/published: 2003, as “Responsibilities of Competence” in Understanding Understanding; a reprint of a keynote address adapted for a 1972 Journal of Cybernetics article.
Code
“The essential function of a Turing machine can be specified by five operations:”
(i) Read the input symbol x.
(ii) Compare x with z, the internal state of the machine.
(iii) Write the appropriate output symbol y.
(iv) Change the internal state z to the new state z´.
(v) Repeat the above sequence with a new input symbol x´.
“Similarly, the essential function of Maxwell’s Demon can be specified by five operations equivalent to those above:”
(i) Read the velocity v of the upcoming molecule M.
(ii) Compare (mv^2/2) with the mean energy (mv̄^2/2, where v̄ is the mean molecular velocity), i.e. the temperature T of, say, the cooler container (internal state T).
(iii) Open the aperture if (mv^2/2) is greater than (mv̄^2/2); otherwise keep it closed.
(iv) Change the internal state T to the new (cooler) state T´ .
(v) Repeat the above sequence with a new incoming molecule M´.
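For readers who want to see the parallel as running code, here is a minimal sketch of the two procedures in Python. This is my own rendering, not Foerster's; the rule table, the toy temperature update, and all of the names are assumptions made only for illustration.

```python
# Von Foerster's two five-step procedures rendered as Python step functions.
# Each takes an input and an internal state and returns an output and a new state.

def turing_step(x, z, rules):
    """Operations (i)-(v) for the Turing machine, in Foerster's framing.
    `rules` maps (internal state z, input symbol x) -> (output symbol y, new state z')."""
    y, z_new = rules[(z, x)]           # (ii)/(iii) compare x with z; pick the appropriate output y
    return y, z_new                    # (iv)/(v)   change z to z'; repeat with a new input symbol x'


def demon_step(m, v, T):
    """Operations (i)-(v) for the demon.
    T stands in for the mean molecular energy (the temperature) of the cooler container."""
    energy = m * v ** 2 / 2            # (i)        read the velocity v of the molecule M
    if energy > T:                     # (ii)/(iii) compare with the mean energy; open only if greater
        T = T - 0.05 * (energy - T)    # (iv)       toy stand-in for the new (cooler) state T'
        return "open", T
    return "closed", T                 # (v)        repeat with a new incoming molecule M'


# Example: a toy rule table that outputs the running parity of its input bits,
# and a short stream of molecules driving the demon.
rules = {("even", 0): (0, "even"), ("even", 1): (1, "odd"),
         ("odd", 0): (1, "odd"), ("odd", 1): (0, "even")}
y, z = turing_step(1, "even", rules)       # -> (1, 'odd')

T = 1.0
for v in (0.8, 1.7, 1.2, 2.1):
    aperture, T = demon_step(1.0, v, T)    # T drifts downward as fast molecules pass
```

The design choice follows Foerster's wording rather than the usual tape-and-head picture of a Turing machine: each procedure is simply a read, a comparison against an internal state, an action, and a state update.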
Analysis
Heinz von Foerster is known as a second-order or second-generation cybernetician, establishing his theoretical program after Norbert Wiener and other first-generation theorists. The code / pseudocode above comes from a keynote address of 1972 in which he problematizes our notions of “hard” and “soft” sciences, arguing that physics and other hard sciences deal with soft problems, but that so-called soft sciences such as sociology deal with hard problems. For Foerster, cybernetics is the use of competencies acquired in the hard sciences to tackle difficult problems in the social sciences.
Having established this alliance of hard and soft sciences, Foerster laments, in an anecdotal way, that it is difficult to discern the “muse” of the discipline. In a way that shows considerable male bias, he decides that this muse is neither the Urania of astronomy nor the Demeter of agriculture, but the creature “sitting beside his desk”, a demon, none other than the being first postulated by James Clerk Maxwell in his Theory of Heat (1871). Maxwell’s demon is a thought experiment in which this intelligence inside a chambered heat canister “beats” entropy, but only on a molecular level in which individual particles can be ordered or kept hot. With the demon, Maxwell had intended to demonstrate the statistical nature of the second law of thermodynamics, that a system ultimately moves towards a greater state of entropy, showing that where larger groups of particles are concerned, negentropy, or order in disorder, is relative.
So in the code above, Foerster suggests that a Turing machine is nearly homologous to a demon machine. The differences in these procedures, although slight, come down to the fact that the demon machine conditionally modifies state, whereas the Turing machine processes state at a level of abstraction above the control structures of the demon machine. For the demon machine, refraining from opening a valve to let a faster molecule go into a hotter chamber is still a propagation if not a modification of state. A point made by Foerster is that this is an order-producing regulatory process, part of a triad of regulation, entropy retardation, and computation.
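To make the homology concrete in code (again my own framing, building on the sketch in the Code section above rather than on anything in Foerster's text): both step functions fit a single driver loop, and the difference noted here is visible in where the conditional lives. The demon's conditional is fixed inside its step function, while the Turing machine's behaviour is drawn from an arbitrary rule table, one level of abstraction up.

```python
# One generic "ordering machine" loop into which either step function can be plugged:
# read an input, compare it with the internal state, act, update the state, repeat.

def run_ordering_machine(inputs, state, step):
    """Drive any step function of the form (input, state) -> (output, new state)."""
    outputs = []
    for x in inputs:
        y, state = step(x, state)
        outputs.append(y)
    return outputs, state

# Plugging in the earlier sketches (hypothetical usage):
# parities, z = run_ordering_machine([1, 0, 1, 1], "even",
#                                    lambda x, z: turing_step(x, z, rules))
# apertures, T = run_ordering_machine([(1.0, 0.8), (1.0, 1.7)], 1.0,
#                                     lambda mol, T: demon_step(mol[0], mol[1], T))
```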
Foerster, in this case, is well aware of the relativity of this order production and in no way believes that one can move beyond the second law. So he argues, humans must take responsibility for the justice and progress of the social sciences and do the best possible with computers, these (relative) order-producing machines. Still, these machines bridge the gap between the thermodynamic demon and information theory, but they also carry a “closed” quality that seems less desirable in the context of the open source movement and of distributed computing today. The gesture of Foerster’s informatics applied to the social is that it must be a closed system:
“As long as humanity treats itself as an open system by ignoring the signals of its sensors that report about its own state of affairs, we shall approach this singularity (the instability of humankind) with no brakes whatsoever. (Lately I began to wonder whether the information of its own state can reach all elements in time to act should they decide to listen rather than fight.) The goal is clear: we have to close the system to reach a stable population, a stable economy, and stable resources.” (Understanding, 197)
The cybernetic system he describes (perhaps puzzling to us now in its "closed" quality) instead only retains a metaphoric image of closure. A schematic, for instance, can be drawn up to represent or describe its functional basis, even if it remains non-isolated in the world. What Foerster is calling for is feedback: that pockets of nature / culture reflexively interrogate themselves. Yet they are not completely closed. This notion in part resembles Deleuze and Guattari’s machinic heterogenesis, in which an actor’s behavior is autopoietic, or closed on itself, but at the collective level of world / society.
So a closed system is rather a reflexive one, which is one of the ways in which Katherine Hayles redeems Foerster in her How We Became Posthuman. But this opening of the closed system can go further. It is known that James Clerk Maxwell was devout and that the demon was like a monotheistic god, while in Thompson’s interpretation it becomes distinctly male and/or devoid of gender as an abstract boîte noire. This male aspect of the demon infiltrates Foerster’s keynote and has continued in the scientific literature on Maxwell’s demon, which often hopes that a true order will prevail, in the ordinary sense of the Judeo-Christian tradition. In contrast, could we not substitute the feminine goddess Sophia of the wisdom tradition as an intelligence with foreknowledge of particles or computational states, a "gendered" Maxwell’s demon? This would be an opening of the closed systems of classical cybernetics and a location of clarity in the gnostic embrace of entropy and irreversibility, one that would approach the reflexivity of Foerster in a quite different, radical way.
Questions
What other "machines" from any knowledge domain for which we could construct a simple algorithm are homologous to the Turing machine and demon machine in Foerster's text?
Without changing the algorithms provided by Foerster, can we read them in such a way that we overcome the binary opposition between information and entropy?
Given that these two algorithms bridge virtual and non-virtual systems, do they also function to enlarge or open up spaces in which programming or coding happens?
Could the opening of these ordering machines provide an impetus for code studies to examine programming in relation to gender and race, among other areas? After all, does not entropy / negentropy expand the domain of coding (at least by analogy) to something that happens on more than simply commercial computing devices (the computers we use or the computers of everyday life)?
Comments
@gregorybringman --
I have some questions about these excellent questions! To start with the question about bridging virtual and non-virtual systems, which seems (if answered in the affirmative) to be the basis for your final question, on gender, race, and everyday life:
I want to be sure I understand the terms here.
By "they bridge," do you mean they act as a bridge in their joint homology, not that the Turing machine and the demon are each in and of themselves a bridge?
In "virtual and non-virtual", which term refers to which example? Are you using virtual in the sense of "not realized"? (A Turing machine, sans infinite length, is realized, hence non-virtual, Maxwell's demon is unrealized, hence virtual). Or do you mean virtual as in "not physically existing as such but made by software to appear to do so"? (A Turing machine is symbolic computational process, hence virtual, Maxwell's demon is a thermodynamic process, hence non-virtual).
@jeremydouglass :
Thanks for pointing out these distinctions! Let me try to clarify:
It still may be fruitful to investigate the first sense you propose of virtual and non-virtual, as we might consider software not virtual and therefore "material". Then how can one read the unrealized demon in terms of a material theory of mind and the virtual as well? Philosophers like Wittgenstein and Kittler point to human minds externalized in media and machines; maybe what is virtual in these situations is only an interpretation by another material reader of an "unrealized" material phenomenon. This kind of circles back to media on which writing is done and the post-structuralist revision, but my first intention was thinking of the demon as non-virtual and the Turing machine virtual.
I hope this clarifies...
@gregorybringman, over in the Week 2 main thread, I proposed that we think of these pieces as isomorphisms, to use Douglas Hofstadter's terminology.
The distinction he draws is between meaningful and meaningless isomorphisms. The measure is whether the mapping of the one system onto the other produces statements that are both valid and true. He offers this in contrast to interpretation, which he sees as imposing outside assignments more loosely.
Since I don't have a systems approach (to critical thinking), I don't put the same valence on meaningful isomorphisms as he does in Gödel, Escher, Bach, but I do think that distinction is relevant here, where you are identifying von Foerster's isomorphism as "meaningful" and calling for others.
And I think you have also identified one of the strong interpretive moves of CCS, this mapping that has a kind of systematic integrity to its metaphoric translations.
@gregorybringman -- Thank you for clarifying! Your references to inscription, Derrida, science studies, and "a larger concept of coding in the world" are really helpful for me in understanding the broader stakes. Interesting that a materialist attention to "processes that aren't actual software programs" does seem to open the door to multiple ways of collapsing the virtual/non-virtual distinction -- at the least, it strengthens the homology.
I am still thinking about your question list, in particular "do they also function to enlarge or open up spaces in which programming or coding happens?" and how it leads into your final question on gender, race, and everyday life.
The first question, if answered in the affirmative, seems to lead directly into the second -- I had a sense that perhaps you already have a concrete concept of how this opening up / enlarging might play out? I'm curious. Might we start recognizing ordering machines everywhere?
Now that Week 3 on Race and Code is also underway: It might be interesting to think on this question about "ordering machines" in the context of the slave codes data and database, and some of the problems being addressed there. In particular, I wonder if it might be relevant to think of the slave economy (and its documentary paper trail) as a kind of ordering machine for enslaved persons, and whether such concepts of code and encoding might also help us think through what happens when researchers (or bot writers) working with that material and creating their own databases / encoding schemes etc. build their own "ordering machine" in relation to the previous one -- although perhaps to a much different purpose.
Thanks, @markcmarino !
What is interesting about Foerster's isomorphism is that the literature on Maxwell's Demon and computing goes into much more detail on this thought experiment, and yet it doesn't have quite the same metaphoric strength as Foerster's juxtaposed five-line procedures.
I am thinking of the anthology of writings edited by Harvey S. Leff and Andrew F. Rex, called Maxwell's Demon: Entropy, Information, and Computing. This text is a fascinating collection of fundamental papers on the demon and computing, and I considered it required reading in preparation for my code critique. But it doesn't do what Foerster's simple juxtaposition does.
On top of this, once we present the two algorithms for our own analysis, they become a set of artifacts that can be overloaded with metaphors that remain completely tied to the material particulars of their code. So once it is established that a demon machine is an ordering machine like a Turing machine, it is then possible to ask what is the nature of this invisible, metaphoric demon that is there but elsewhere.
Substitutions, or a chain of substitutions, can then occur, e.g. demon -> monotheistic god -> gnostic deity. Or a demon can be an ordering machine in some other knowledge domain that realizes any one of the infinite implementations of a Turing machine (to which @ebuswell alluded in week 2), but in a domain-specific algorithm.
From this vantage point, we can look at gender and race using the notion of "ordering machines", seeing them "everywhere", as @jeremydouglass proposes. Jeremy, I think that the slave codes database is a good fit for the intervention of the "ordering machine". Implicit in computing, and implicit in the demon thought experiment, is a notion of control and systems of control: humans have birthed computing machines, but these machines have become systems of oppression as well.
As a result, coding is one way to open a dialogue with these machines that oppress us, as is the act of the researchers in the slave codes data project incorporating themselves into the project's data schema definition. This in a way mimics a Foucauldian disappearance of the subject from knowledge spaces, but the researchers in this case are active producers, and they suddenly become visible by ironically allowing themselves to be contained by their code. In this gesture, they potentially make all of the enslaved persons in their data visible as well.
So it would most definitely be interesting to write a five-line procedure, after Foerster, that demonstrates a system of control and functions as a critique of racism, in a domain specific to the dialogue on race (like the slave codes database).
Or "ordering machine" could provide an impetus to productive acts of resistance - no doubt with some irony in the fact that critiques proceed from ordering structures imposed on critics themselves...