towards day 18 @ sf, part 1

In learning by doing, or by interacting with and adjusting to materials, machines and models, experimentalists progressively discern what is relevant and what is not in a given experiment. In other words, the distribution of the important and the unimportant defining an experimental problem (what degrees of freedom matter, what disturbances do not make a difference) are not grasped at a glance the way one is supposed to grasp an essence (or a clear and distinct idea), but slowly brought to light as the assemblage stabilises itself through the mutual accommodation of its heterogeneous components. In this assemblage the singularities and affects of the experimentalist’s body are meshed with those of machines, models and material processes in order for learning to occur and for embodied expertise to accumulate.

– Manuel DeLanda, Intensive Science and Virtual Philosophy, Continuum, London, 2004, p. 177

Manuel DeLanda is addressing himself to the nonlinearity of the laboratory, with all its props, actors and machina, when it comes to pursuing phenomena and establishing causes. What the laboratory causes to happen in a nonlinear fashion is the progressive individuation of a desired outcome or phenomenon. Theoretical models, experimental precedents, in addition to the skill levels of the various actors, play their part in this assemblage, as do the usual materials, equipment and techniques of the laboratory, in a directed but adaptive and therefore open-ended sense. The sense of the experiment is after all what is at issue, its differentiation and stabilisation as particular to the given conditions, as individual and singular but replicable. In view of this sense, the experiment can be described not only as the enactment or actualisation of a probabilistic field but also as the close observation and progressive definition of all the conditions in the field, including their recording. This recording will of course conceal the actual causality by omission. It will presuppose the sense of a given experiment and, like any other set of instructions, omit the givens of the given, for example, that an experimentalist must needs have acquired a certain degree of skill or expertise in order to follow and replicate the experiment. The transmission of expertise, in order that an experiment may be said to be repeated and the results replicated, or rehearsed and the production realised, assumes a function contributing to causality. The idea, sense or problem of the experiment is rehearsed surely only insofar as there is a chance element, a diversion into the aleatory through which all the givens must pass. Such a chance would not simply allow mistakes to happen but would open up the assemblage of parts, in the laboratory, to the generation of unforeseen individuations, whether by mistake or design.
Wouldn’t this also mean that the replication of standard results in an experiment, that is, replicability, requires this chance and invokes it? And that the rehearsal of the same rests on the difference made by the introduction of the arbitrary?