
Of Epistemic Tools: musical instruments as cognitive extensions

THOR MAGNUSSON

ixi audio, Music Informatics Research Lab, Department of Informatics, University of Sussex, Brighton BN1 9RH
E-mail: [email protected]
Organised Sound 14(2): 168–176, August 2009. doi:10.1017/S1355771809000272. © 2009 Cambridge University Press.

This paper explores the differences in the design and performance of acoustic and new digital musical instruments, arguing that with the latter there is an increased encapsulation of musical theory. The point of departure is the phenomenology of musical instruments, which leads to the exploration of designed artefacts as extensions of human cognition – as scaffolding onto which we delegate parts of our cognitive processes. The paper succinctly emphasises the pronounced epistemic dimension of digital instruments when compared to acoustic instruments. Through the analysis of material epistemologies it is possible to describe the digital instrument as an epistemic tool: a designed tool with such a high degree of symbolic pertinence that it becomes a system of knowledge and thinking in its own terms. In conclusion, the paper rounds up the phenomenological and epistemological arguments, and points at issues in the design of digital musical instruments that are germane due to their strong aesthetic implications for musical culture.

1. INTRODUCTION

The philosopher Don Ihde is well known for his analysis of how we establish a relationship with the world through tool use (Ihde 1979, 1990). Ihde reports on various phenomenological modalities in our relationship with tools, two of which are relevant to this article. For Ihde, the acoustic musical instrument is an illustrative example of a technology that enables an embodiment relationship to the world. The instrument becomes an extension of the body, where trained musicians are able to express themselves through incorporated knowledge that is primarily non-conceptual and tacit. The other phenomenological mode, the hermeneutic relationship, differs in the sense that here the instrument is not an extension of the body, but rather a tool external to the body whose information we have to interpret (thus hermeneutic). This instrument can be seen as a text, something we have to read in our use of it. Disregarding the perils of dualism, and acknowledging that these distinct relationships can overlap in the same musical instrument, I would like to propose that many digital instruments are to be seen primarily as extensions of the mind rather than the body. This seemingly dichotomous statement will be explored in section 3 below.

Furthermore, and in relation to the above, I will argue that while acoustic instruments afford a strong embodiment relationship with the world (or the terminus of our activities – the physical energy of sound), digital instruments increasingly tend to construe us in a hermeneutic relationship with the world. The tangible user interfaces that apparently constitute many digital musical instruments are but arbitrary peripherals of the instruments' core – that is, a core that is essentially a symbolic system of computational design. I contend that the primary body of the digital instrument is that of symbolic instructions written for the meta-machine, the computer. As opposed to the body of the acoustic instrument, the digital instrument does not resonate; it contains few latent mysteries or hidden expressive potential of the kind that can typically be derived from the materiality of acoustic instruments (Edens 2005). The functionality of the digital instrument is always explicitly designed and determined. Indeed, the digital musical instrument – especially if it makes use of automation or other mappings that are not one-to-one gesture-to-sound – is constituted by generic, prescriptive and normative sets of rules that affect or direct the musician at the high level of musical language (both formal and theoretical).

The focus of this paper is novel digital musical interfaces, in particular those to be found in a research field best represented by the NIME (New Interfaces for Musical Expression) conference series. Although they are relevant as well, the analysis is not directed at digital pianos or pure studio simulators like ProTools. What is of interest are the computational music systems used to build expressive intelligent instruments, or composed instruments (Schnell and Battier 2002), where the distinction often blurs between instrument and composition on the one hand, and performance and composition on the other. Composed instruments typically contain automation of musical patterns (whether blind or intelligent) that allow the performer to delegate musical actions to the instrument itself, such as playing arpeggios, generating rhythms, expressing spatial dimensions as scales (as opposed to pitches), and so on. These systems are therefore split between the physical interface and the programmed sound engine. Typically such engines are programmed in environments like Pure Data, SuperCollider, ChucK, Max/MSP or Kyma. These tools have such advanced algorithmic capabilities that there is no need to limit the instrument to direct one-to-one gesture-to-sound mapping; the instrument can implement various complex (and even adaptive) mapping structures and contain various degrees of automation, from simple looping to complex artificial intelligence responses. All these computational techniques are impossible in acoustic instruments, and their theoretical implications unavoidably involve an explicit systemic representation of music as a rule-based field or a creative search space (Boden 1990).
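
By way of illustration, the split between interface and sound engine, and the delegation of note-level decisions to the instrument, can be sketched in a few lines of code. The following Python fragment is a deliberately minimal sketch, not drawn from any of the environments named above; the class names, the pentatonic scale and the mapping curves are invented for the example:

# A minimal sketch of a 'composed instrument': the physical interface is reduced
# to a single gesture value in [0, 1]; the sound engine is a symbolic rule system
# that maps that value one-to-many and delegates note-level decisions (an arpeggio)
# to the instrument itself. All names here are illustrative, not any real API.

from dataclasses import dataclass
from typing import List, Optional


@dataclass
class EngineState:
    cutoff_hz: float = 1000.0                # timbral parameter driven by the gesture
    arpeggio: Optional[List[int]] = None     # MIDI notes the automation will cycle through


class ComposedInstrument:
    SCALE = [60, 62, 64, 67, 69]             # an encapsulated music-theoretical choice (major pentatonic)

    def __init__(self):
        self.state = EngineState(arpeggio=[])

    def on_gesture(self, value: float):
        """One gesture value is mapped to several engine parameters at once."""
        self.state.cutoff_hz = 200.0 + value * 8000.0      # continuous mapping
        span = 1 + int(value * (len(self.SCALE) - 1))      # discrete mapping
        self.state.arpeggio = self.SCALE[:span]            # the gesture selects a chord span

    def tick(self, step: int):
        """Automation: the instrument, not the performer, chooses the next note."""
        if self.state.arpeggio:
            note = self.state.arpeggio[step % len(self.state.arpeggio)]
            print(f"step {step}: note {note}, cutoff {self.state.cutoff_hz:.0f} Hz")


if __name__ == "__main__":
    inst = ComposedInstrument()
    inst.on_gesture(0.7)            # a single physical action...
    for s in range(8):              # ...results in an automated musical pattern
        inst.tick(s)

Here a single gesture value is mapped one-to-many (to a timbral parameter and to the span of an arpeggio), and the instrument, rather than the performer, chooses each successive note, which is precisely the kind of encapsulated musical decision-making discussed above.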

I will define the computational music system as an epistemic tool: an instrument (organon) whose design, practice and often use are primarily symbolic.¹ The concept of epistemic tools is not intended to define exclusively digital technologies; it includes all tools that work as props for symbolic offloading in our cognitive process. Good examples of symbolic but analogue machines are the abacus and the astrolabe. In order to clarify this point, this paper proceeds to explore the nature of embodiment as a non-representational or non-symbolic process. It then looks at how human cognition can make use of material supports outside the body, echoing Andy Clark's statement that cognitive processes can 'extend outside the head of an individual' (Clark 1996: 81). The central question then becomes what it means when music is composed and performed with intelligent artefacts that are inscribed with a specific music-theoretical outlook.

The question of epistemic tools can be posed like this: if much of our thinking happens 'in the wild' (Hutchins 1995), external to our body, as a socio-cultural process that uses technology as an external scaffolding of cognitive processes, and if our learning is largely a process of incorporating knowledge in a non-symbolic way through an enactive relationship with our tools and environment, how do computers, as necessarily symbolic devices, enable, produce, maintain, support and augment, but also constrain and limit, our cognitive processes and therefore our creative output?

The aim of this paper is to identify two linked distinctions that are becoming increasingly apparent in the design and analysis of modern musical instruments. Firstly, it examines the different embodied experiences available in acoustic and digital instruments (which is primarily a phenomenological investigation); this route has been navigated most often in the NIME community. Secondly, it explores the disparate theoretical and material affordances of acoustic and digital instruments (here seen as an epistemological enquiry); whilst related to and constraining the first area, previous work has largely neglected this perspective. This paper therefore seeks to correct the imbalance between these two distinctions in the analysis of digital musical instruments by addressing the epistemological nature of our new musical instruments.

2. THE MASTERY OF A MUSICAL INSTRUMENT: A SUB-SYMBOLIC SKILL ACQUISITION

Twentieth-century cognitivism has not been successful in portraying human intelligence, and various approaches have emerged that propose different views of cognition where typically the body and the environment enter the equation.² One of the most musically relevant approaches is enactivism, as developed by Varela, Thompson and Rosch (1991). This theory, with roots in biology but inspired by phenomenology and Eastern thought, depicts the mind as necessarily embodied in an external environment. In enactivism there is no distinction between perception and action; both co-emerge through the agent's active involvement with the world. This involvement is primarily non-representational – in other words, it is at a level where cognition is subconscious, pre-conceptual, distributive and emergent. Varela et al. define the term:

We propose the term enactive to emphasize the growing conviction that cognition is not the representation of a pregiven world by a pregiven mind but is rather the enactment of a world and a mind on the basis of a history of the variety of actions that a being in the world performs. (Varela et al. 1991: 9)

¹ As an example we might take Michel Waisvisz's The Hands instrument. Every sensor of the complex interface is mapped to a specific parameter in the software-based sound engine. A change in the engine will result in a new (or altered) instrument. Although the interface has not been altered by a change in the mapping algorithm, the instrument behaves differently. For Waisvisz, changing the algorithms that constitute the sound engine means learning a new instrument, which involves the re-incorporation of the conceptual understanding of the engine's functionality into bodily memory (Waisvisz 1999, 2005).

² Cognitivism is here used as a term that denotes the trend in cognitive science from the mid twentieth century to view human cognition as primarily symbolic. This resulted in a tradition in Artificial Intelligence now called GOFAI (Good Old Fashioned AI). Arguably, the distinct fields of cognitive science and computer technology symbiotically constituted each other's rise through this period. The new theories criticising cognitivism include: Connectionism, where the idea is to build artificial neural networks that perform cognitive tasks through non-representational content (McClelland, Rumelhart and the PDP Research Group 1986); Enactivism, which claims that the whole body and the environment become part of the cognitive function (Varela et al. 1991); Dynamic Systems, a non-representational theory claiming that cognition (and consciousness) arises as an epiphenomenon of the process of being in the world (Brooks 1991); Situated Cognition, a theory of knowledge acquisition as situated, being in part a product of the activity, context and culture in which it is developed and used (Brown et al. 1989); Situated Action, an emphasis on the constitutional context of all action as emergence (Suchman 1987); Activity Theory, where the focus is on human tool use and the cultural and technological mediation of human activity (Bertelsen and Bødker 2003); Embodied Cognition, where thinking and acting are seen as one process, emphasising our situatedness and claiming that our cognitive system emerges from interaction with the world (Anderson 2003); and Distributed Cognition, where cognition is seen as an interaction between humans and artefacts, and emphasis is laid on the social nature of human existence (Hutchins 1995).

Importantly, this does not mean that humans are not symbolic cognisers. That fact is obvious by looking at how humans constantly create and use symbolic systems such as language, mathematics and music, or games such as chess. Stressing that they 'see symbols as a higher-level description of properties that are ultimately embedded in an underlying distributed system' (Varela et al. 1991: 101), Varela et al. find it more appropriate to define symbols (or cognitive representations) as 'approximate macrolevel descriptions of operations whose governing principles reside at a subsymbolic level' (Varela et al. 1991: 102).

Varela et al. describe embodiment as an enactive stance towards the world where we create our (not 'the') world through an active engagement with it. Enactivism is resourceful in explanations of embodiment, something we should bear in mind when analysing how acoustic musicians incorporate knowledge of their instruments through repeated practice. It also explains how symbolic systems are higher-level descriptions of phenomena understood by the agent through bodily perceptions. There is a general consensus in cognitive science that musicians (or athletes for that matter) learn their skills gradually through persistent practice and a minimum of verbal instructions (Dreyfus and Dreyfus 1986). Such bodily incorporation transforms our tools into ready-to-hand phenomena (Heidegger 1962) that are no longer based on symbolic or theoretical relations. The focus becomes the act and not the object, or, as Winograd and Flores explain in a Heideggerian manner, '[m]y ability to act comes from my familiarity with hammering, not my knowledge of a hammer' (Winograd and Flores 1986: 33).

According to the enactive view, the skill acquisition related to learning an acoustic instrument is highly embodied, non-symbolic and perceptuo-motor based.³ It explains how Ihde's 'embodiment relations' to the world are established, and provides a description of this process of incorporation⁴ from the level of biology. An important research question here becomes to explore, together with the enquiry into the epistemological or music-theoretical nature of digital tools, how the digital musical instrument manifests a different relationship between the human body and the body of the instrument itself through its characteristic split between the interface and the sound engine.

3. THE EXTENDED MIND

In the 1990s, working in another strand of cognitive science from that of Varela et al., Andy Clark developed his theory of the 'extended mind'. Clark illustrates how people use props in the environment to extend their cognitive capacity and ease cognitive load. Sticky notes, notebooks, diagrams, models, and so on all serve as scaffoldings onto which we 'offload' our cognition. It should be noted that this is not some kind of mystical panpsychism, as Clark and Chalmers adamantly point out that they do not equate the cognitive process with consciousness (Clark and Chalmers 1998). The cognitive process happens both inside and outside the skull, an observation reminiscent of Wittgenstein's account of thinking as physical activity (Wittgenstein 1969: 6).

[In certain conditions, t]he human organism is linked with an external entity in a two-way interaction, creating a coupled system that can be seen as a cognitive system in its own right. All the components in the system play an active causal role, and they jointly govern behavior in the same sort of way that cognition usually does. If we remove the external component the system's behavioral competence will drop, just as it would if we removed part of its brain. Our thesis is that this sort of coupled process counts equally well as a cognitive process, whether or not it is wholly in the head. (Clark and Chalmers 1998: 7)

Objects and artefacts serve as an external playground for thinking. We calculate on paper, draw graphs, build models, make reminders of all kinds, and, as Clark illustrates, we reorganise the pieces when playing Scrabble. According to this view, it does not matter whether information is stored and processed in the brain or outside in the environment; what matters is how the data is retrieved and taken into use (Clark 2003).

While the theory of the extended mind has much potential for interesting discussions in cognitive science, it is pertinent, in the context of musical technology, to probe deeper. Indeed, the picture is more intricate, since we are surrounded by objects and technologies (or 'thrown' into a world of equipment, as Heidegger would describe it) that contain complex material properties, scripts of their usage (Latour 1994), and even politics (Winner 1980). The objects around us – the technologies that serve as props in our thinking and music making – are stuffed with people and their ideas; in them we find programmes of action, manuals of behaviour, and political and sociocultural constructions, including aesthetic tendencies. Technological objects are therefore never neutral; they contain scripts that we subscribe to or reject according to our ideological constitution. The problem is that the scripts are often well hidden and concealed, which can result in an uncritical use of creative technologies – technologies that inhere ideological content.

³ Useful accounts of the non-symbolic nature of learning are David Sudnow's ethnographic account of learning to play jazz piano (Sudnow 2001) and Dreyfus and Dreyfus' analysis of the five stages of skill acquisition (Dreyfus and Dreyfus 1986).

⁴ On the process of incorporation in the context of embodiment, see Hayles (1999).

To illustrate this point in the field of musical instruments, we see that the piano keyboard 'tells us' that microtonality is of little importance (and much Western music theory has wholly subscribed to that script); the drum-sequencer tells us that 4/4 rhythms and semiquavers are more natural than other types; and the digital audio workstation, through its affordances of copying, pasting and looping, assures us that it is perfectly normal to repeat the same short performance over and over in the same track. It is therefore germane to ask ourselves how much theory is inscribed into our new tools, particularly with regard to how automated, structurally complex and aiding (in composition and performance) our new musical instruments have become.
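
The point can be made concrete with a sketch of the 'script' hidden in an ordinary drum-sequencer. The Python fragment below is illustrative only (the class and its fields are invented, not taken from any existing product); what matters is that the data structure itself presumes a sixteen-step 4/4 grid, so that other metres and swung or irregular subdivisions simply have no representation:

# A sketch of the 'script' hidden in an ordinary drum sequencer: the data structure
# itself assumes 4/4 metre and a sixteenth-note grid. Nothing forbids other metres in
# principle, but the encoded default quietly presents this outlook as natural.
# The class and its fields are illustrative only.

class StepSequencer:
    STEPS_PER_BAR = 16                # 4/4, sixteenth-note resolution, baked in

    def __init__(self, bpm: float = 120.0):
        self.bpm = bpm
        self.pattern = [False] * self.STEPS_PER_BAR

    def toggle(self, step: int):
        index = step % self.STEPS_PER_BAR
        self.pattern[index] = not self.pattern[index]

    def step_times(self):
        """Onset times in seconds; a 7/8 bar or a swung grid has no home here."""
        seconds_per_step = 60.0 / self.bpm / 4.0     # four sixteenths per beat
        return [i * seconds_per_step for i, hit in enumerate(self.pattern) if hit]


seq = StepSequencer()
for s in (0, 4, 8, 12):               # the grid invites the four-on-the-floor pattern
    seq.toggle(s)
print(seq.step_times())

Nothing forbids other metres in principle, but the encoded default quietly presents this particular rhythmic outlook as the natural one.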

4. MATERIAL EPISTEMOLOGIES

Things are impregnated with thoughts; they are embedded with ideologies that have ethical, political and aesthetic implications. Furthermore, we use those artefacts in our daily tasks as extensions of our cognitive mechanism. But how do things 'contain' knowledge? How do we write our knowledge into artefacts, and how do we read that knowledge from them? By the same token, how does this relate to digital musical instruments?

Davis Baird (2004) shows how material objects can have a different epistemological status to statements of language. Technological objects may contain functions which their users understand, but would not be able to describe in language, as '[t]he material products are constitutive of scientific knowledge in a manner different from theory, and not simply "instrumental to" theory' (Baird 2004: 1). Much like Heidegger and Ihde, Baird shows that technological objects can precede science, and afford scientific discoveries through their physical structure and functionality. The point here is not what precedes what, but rather that the instrument becomes an expression in itself, an externalisation of knowledge in a form that is not symbolic but material. Additionally, it is not only designed objects that inhere knowledge, but natural objects as well, by means of the physical and mechanistic properties of their material.

Knowledge can be expressed in many ways. Theories express knowledge through the descriptive and argumentative functions of language. Instruments express knowledge both through the representational possibilities that materials offer and through the instrumental functions they deploy. (Baird 2004: 131)

Baird is well aware of Bruno Latour's ideas of concretisation: namely, that when many elements combine into one actor and start to operate as a unity (with either human or non-human agency), they gradually become a black box. When the black box works, its origins are forgotten, and thus 'paradoxically, the more science and technology succeed, the more opaque and obscure they become' (Latour 1999: 304). Baird agrees with Latour on the nature of scientific blackboxing, but highlights another and perhaps more epistemologically active function of the black box itself: while talking about a particular instrument, called Spectromet, Baird says

[t]he knowledge used in this context is tacit in the sense that those using the instrument (typically) could not articulate the understanding of spectrochemistry they deploy in doing so. Nonetheless, they can use it. This spectrochemical knowledge has become detached. It has gone inside – inside the instrument – and can now tacitly serve other technical and scientific purposes. (Baird 2004: 163)

Here (and spectacularly exemplified in the digital musical system) we see how the blackboxed instrument contains the knowledge of its inventors, which means that the users of the instrument do not need to have a deep knowledge of its internal functions. If we assume that both the designers and the users of the instrument have an understanding of it, this understanding is very different and attained from distinct origins. The former creates the instrument from a conceptual understanding of the domain encapsulated by it, whereas the latter gains operational knowledge that emerges through use (or habituation) and not from an abstract understanding of the internal functionality. This picture is particularly complex in today's new musical interfaces, as typically their designers are also the performers. This implies a continuous oscillation between a mode of conceptual (system design) engagement with the instrument and an embodied (performative) relationship with it. Again, we are reminded of Waisvisz's stance introduced above, where we might talk of two modes: that of the instrument designer and that of the instrument player.

5. THE ACOUSTIC, THE ELECTRIC AND THE DIGITAL: INTERFACES OF A DIFFERENT KIND

When the technological artefact is made of a material substratum (as opposed to a symbolic one), it can contain knowledge that precedes the scientific understanding of its functioning. The acoustic instrument is a good example. The sophisticated sound of the clarinet or the cello was developed over an extended period of time, but this sound was mature long before Fourier or Helmholtz brought forth their theories of sinusoidal functions and timbre. Strings, wood and brass tacitly encompass the theories of sound in their materiality. The observation here is that acoustic instruments inhere knowledge that has to be explicitly understood and stated in the design of digital instruments; without this explicit knowledge no code could ever be written. We explore and discover the sonic properties of wood and strings, but a solid theoretical knowledge of sound is required in order to create digital musical systems.

Let us briefly explore some main differences in the design of acoustic, electric and digital instruments. Acoustic instrument builders acquire their skills through the embodied practice of making the instruments. These skills, including an intuition for material properties and sound physics, largely take the form of tacit knowledge (Polanyi 1966) acquired through apprenticeship. It is a predominantly embodied and non-theoretical knowledge derived from discovery, exploration and refinement. The development of the acoustic instrument involves iterations of designing, building and testing. The instrument makers work with what physical materials afford in terms of acoustics; a change in curve, in width or in material means a change in sonic timbre. The interface is necessarily conditioned by the properties of the materials used, and various solutions are therefore introduced to adapt to or extend the scope of the human body with the use of levers, keys and other mechanisms. Acoustic instruments are therefore firmly grounded in and conditioned by human physique in their design.

The makers of electronic instruments, such as the early synthesisers, work with different materials. Instead of drawing on millennia of tradition, they start by exploring the material properties of electricity, magnetic waves, oscillators, capacitors, inductors and transistors. As with the acoustic instrument, the creative process is one of designing, building and testing through a cycle of iterations, actively working with the materials. However, the materials of the electronic instrument makers come with instructions and schematic diagrams that describe their behaviour. There is an increased logic of calculation, science and engineering. Fourier's and Helmholtz's theories are now well known, and the instrument makers can draw on that knowledge in their designs of oscillators and filters. However, some of the characteristic sound of electronic instruments depends on the chaotic or entropic properties of the materials used.⁵ The user interfaces of electronic musical instruments can be built in any shape and form. In terms of ergonomics, we are still constrained by physical mapping, so once the instrument has been wired up, its fundamental functionality is not easily changed. This gives the machine an instrumental quality, a character that affords in-depth exploration.

The digital instrument makers are in a different world altogether. The 'workshop' is the meta-machine of the computer, and inspecting them at work might not suggest any associations with musical activities.⁶ Code as material is not musical; it does not vibrate; it is merely a set of instructions turned into binary information and converted to an analogue electronic current in the computer's soundcard. The materials of the digital instrument are many: a computer, a monitor, a soundcard, an amplifier, speakers and tangible user interfaces. Behind this material surface lie other materials: audio programming languages, digital signal processing, operating systems, mapping mechanisms between gestures and sound engines, and so on. From the perspective of Latour's actor-network theory, the networks enrolled in the production of digital instruments are practically infinite. There is an impenetrable increase in complexity, which means that the inventors have to rely constantly on black boxes. Furthermore, the materials used in the digital instrument originate from technoscientific knowledge. There are relatively few physical material properties at play (although of course at the machine level we find matter) compared to the amount of code that constitutes its internal (and symbolic) machinery. The inventors have knowledge about digital signal processing, sound physics, audio synthesis, gesture recognition, human–machine interaction, and the culture of musical performance. In general, the digital instrument is based on the knowledge of symbolic systems and their essence in the form of code. From a design perspective, any interface can be designed for any sound. There is no natural mapping between gesture and sound in digital systems. In acoustic instruments the performer yields physical force to drive the instrument, whereas in electric instruments there can be a mixture of both physical and electric force. In digital instruments, the physical force becomes virtual; it can be mapped from force-sensitive input devices to parameters in the sound engine, but that mapping is always arbitrary (and on a continuous scale of complexity), as opposed to what happens in physical mechanisms.
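
The arbitrariness of digital mapping can likewise be shown in a few lines. In the hypothetical Python sketch below (the parameter names and mapping curves are invented, not taken from any existing instrument), the same force-sensor reading is routed through interchangeable functions to whichever engine parameters the designer chooses, and the routing can be swapped while the physical interface remains untouched:

# A sketch of the arbitrariness of digital mapping: the same force-sensor reading can
# be routed to any engine parameter through any function, and the routing can be
# swapped without altering the physical interface. Names are hypothetical.

import math
from typing import Callable, Dict

# the 'sound engine' reduced to a dictionary of parameters
engine: Dict[str, float] = {"amplitude": 0.0, "cutoff_hz": 500.0, "grain_density": 1.0}

# interchangeable mapping functions from a normalised sensor value in [0, 1]
mappings: Dict[str, Callable[[float], float]] = {
    "amplitude": lambda f: f ** 2,                        # perceptually motivated curve
    "cutoff_hz": lambda f: 200.0 * math.exp(4.0 * f),     # exponential sweep
}

def on_sensor(force: float):
    for param, fn in mappings.items():
        engine[param] = fn(force)

on_sensor(0.5)
print(engine)

# 'changing the algorithm means learning a new instrument': re-route the same sensor
mappings["grain_density"] = mappings.pop("cutoff_hz")     # identical gesture, new behaviour
on_sensor(0.5)
print(engine)

This is the software analogue of Waisvisz's observation above: an identical gesture yields new behaviour once the mapping algorithm is changed, and with it, in effect, a new instrument.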

Therefore, from the perspective of embodiment relations, there is a characteristic diversity in the way we work with the materials that constitute our musical instruments. Digital music systems, whose foundations are essentially of a symbolic nature, are more likely to establish hermeneutic relations to the world than acoustic instruments, where the understanding of expressive affordances is incorporated into the performer's motor memory. To work with symbolic tools means that one has to switch modes continually, from focusing on the world to focusing on the tool, at regular intervals and to a more pronounced degree than with acoustic instruments. This detachment from the terminus of our activities could be paraphrased as a disruption in flow, and it is present in the majority of existing digital music systems.

⁵ A good example is how the oscillators in the Moog synthesisers detune, resulting in a full and pleasant chorus-like sound. In later synthesisers, like the ARP, it was discovered that by placing the oscillators on the same material, next to each other, they would have the same temperature and therefore not go out of tune as much as in the Moog (Pinch and Trocco 2002).

⁶ There might not be any keyboards, notes or indeed sounds in the air – a fact that apparently prompted Bob Moog to attach a keyboard to his analogue synths so that, when he was photographed in his studio, people would be more likely to relate this new technology of knobs and wires to music (Pinch and Trocco 2002).

6. EPISTEMIC TOOLS

Anyone who speaks more than one language, in particular if those languages are of different linguistic families, knows how differently each language portrays the world. A language is a world-view. This observation is the basis of a theory called the Sapir–Whorf hypothesis, which states that different languages portray both space and time differently.⁷ The Sapir–Whorf hypothesis claims that the grammatical categories of the language determine the speaker's behaviour: 'We cut nature up, organize it into concepts, and ascribe significances as we do, largely because we are parties to an agreement to organize it in this way – an agreement that holds throughout our speech community and is codified in the patterns of our language' (Whorf 1956: 213). Seventy years earlier, Nietzsche had argued that physics was an interpretation of the world and not its explanation. The sciences reject phenomenological experience in favour of objective language, and they 'do this by means of the pale, cool, gray, conceptual nets which they threw over the colourful confusion of sense, the rabble of the senses' (Nietzsche 2004: §14). Rather more recently, Bowker and Star (2000) have argued that there are certain problems in the design of computer systems that relate to the act of classification. When we classify and categorise our external world, in order to represent it through the machinery of information technology, we are inevitably performing acts of reduction. Such reductions are bound to be contingent and messy, with an abundance of ethical and political implications. Human actions are displaced into representation, thus establishing strata of complexities and interdependencies that limit the agent.

This nature of categorisation and abstraction of human knowledge (both know-that and know-how) and actions is the essence of computer software, a fact that is vividly apparent in the realm of musical software tools. Software is, as Bowker and Star point out, a 'contingent' and 'messy' classification that has diverse implications and, in our case, aesthetic, cultural and culture-political effects.

As Latour (1987) so elegantly demonstrates, objects establish themselves as black boxes through repeated use, and in that process their origins disappear. The object is naturalised⁸ through heavy use, a fact that makes Bowker and Star observe that '[t]he more naturalized the object becomes, the more unquestioning the relationship of the community to it; the more invisible the contingent and historical circumstances of its birth, the more it sinks into the community's routinely forgotten memory' (Bowker and Star 2000: 299).

Writing digital musical interfaces therefore necessarily entails the encapsulation of a specific musical outlook. It is a (sub-)culturally conditioned systematisation of musical material. Musical patterns, musical styles and musical aesthetics become blackboxed in software. Most people do not know why the standards, implementations, patterns or solutions in musical software are there. At times they manifestly limit musical expression, but at other times the limitations are concealed by the rhetoric, streamlined functionality and slick interface design of the software tool. Here we encounter yet another difference between the worlds of acoustic and digital instrument making. The acoustic instrument maker can freely re-invent or improve the instrument at any time with new materials, changing structures or adding/deleting features (Eldredge and Temkin 2007). The software instrument maker is limited by the complex infrastructure of operating systems, programming languages, protocols and interface limitations. In digital instruments, systems of classification form an organisation of musical language, a clarification and explicit elucidation of the musical language and its rules. However, as has been well known since Wittgenstein's later writings, languages are not easily formalised as systems. At the core of all languages are dynamic, emergent and adaptive sets of rules that change through time and differ across geographical locations and sub-cultures. The act of formalising is therefore always an act of fossilisation. As opposed to the acoustic instrument maker, the designer of the composed digital instrument frames affordances through symbolic design, thereby creating a snapshot of musical theory, freezing musical culture in time. The digital instrument is thus more likely to contain an expressive closure, as contrasted with the explorative openings of the acoustic instrument.
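
A small example may show how such a 'snapshot of musical theory' looks in code. The following Python sketch of a pitch quantiser is hypothetical (the scale, reference pitch and rounding strategy are arbitrary choices of the example), but it illustrates how a few frozen lines of theory define what the tool can and cannot express:

# A sketch of 'formalising as fossilising': a pitch quantiser that snaps any incoming
# frequency to twelve-tone equal temperament and a major scale. The few lines of
# theory below are a frozen, culture-specific classification; microtonal or flexible
# intonation simply falls outside the representable space. Illustrative code only.

import math

A4 = 440.0
MAJOR = {0, 2, 4, 5, 7, 9, 11}                 # scale degrees as pitch classes

def quantise(freq_hz: float) -> float:
    midi = 69 + 12 * math.log2(freq_hz / A4)   # continuous pitch onto the 12-TET grid
    nearest = round(midi)
    while nearest % 12 not in MAJOR:           # then force it up to an in-scale pitch class
        nearest += 1
    return A4 * 2 ** ((nearest - 69) / 12)

print(quantise(452.0))   # a quarter-tone inflection is silently 'corrected' to 440 Hz

A quarter-tone inflection is silently 'corrected' to the nearest note of the encoded scale; whatever falls outside the classification falls outside the instrument.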

From our analysis of epistemic tools, we can now crudely generalise (and therefore exaggerate for the sake of argument) a core difference between the acoustic and the digital. The acoustic instrument is material and developed from a bottom-up exploration of the acoustic properties of the materials used. Here, the sound generation (and the required knowledge of it) is given to us for free by nature. We had complex instruments in the form of lutes, flutes and organs long before we ever had a mathematical understanding of sound. The physics of wood, strings and vibrating membranes were there to be explored, not invented. However, in using the acoustic instrument we develop an intuitive understanding of the instrument, the music theory and the tradition in which we are playing. As the interaction with the instrument is primarily embodied, we have to train our body to be in rhythm with the instrument – in other words, to incorporate the musical knowledge in sync (or mutual resonance) with the particular instrument of choice. As opposed to the generic explicitness of the digital instrument, the acoustic instrument contains a boundless scope for exploration, as its material character contains a myriad of ways for instrumental entropy, or 'chaotic', non-linear behaviour, that cannot be mapped and often differs even within the same type (brand and model) of instrument.

⁷ This hypothesis has many followers and critics. A notable criticism was voiced by Stephen Pinker (1994), who, in a Chomskyan manner, argues for a universal underlying structure of language.

⁸ Or 'concretised' or 'pointilised' in actor-network theory lingo.

The digital instrument is theoretical and developed from a largely top-down methodology. In order to make the instrument, we need to know precisely the programming language, the DSP theory, the synthesis theory, generative algorithms and musical theory, and have a good knowledge of human–computer interaction. To use the instrument, on the other hand, we learn the tool through working with it and habituate ourselves to its functions, progressively building an understanding of its workings. As users we often do not need to know as much about synthesis or music theory, as the black box is intended to be used through its simple and user-friendly interface. Ergonomically, the interaction happens primarily through a symbolic channel, which gradually teaches the user to operate with technical terms (such as 'low-pass'), but this happens through habituation to the model (what we call the epistemic tool). The predefined quality of the digital instrument means that its functionality can be exhaustively described in a user manual; all is supposed to be explicit, nothing covert. Where the digital instrument exhibits any chaotic or entropic behaviour, it tends to be due to a failure in design, a bug in the code or loose wiring in the hardware.
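
What hides behind a single symbolic term such as 'low-pass' can be suggested by a short sketch. The Python fragment below wraps a standard one-pole smoothing filter, y[n] = y[n-1] + a * (x[n] - y[n-1]), behind a single 'cutoff' parameter; the class name and the exact coefficient formula are illustrative assumptions rather than any particular product's implementation:

# A sketch of what hides behind one symbolic term on a user interface: a 'low-pass'
# knob wrapping a one-pole filter, y[n] = y[n-1] + a * (x[n] - y[n-1]). The user
# habituates to the word and the knob; the DSP theory stays inside the black box.
# Names and the coefficient approximation are illustrative.

import math

class LowPass:
    def __init__(self, cutoff_hz: float, sample_rate: float = 44100.0):
        # common approximation of the smoothing coefficient for a given cutoff
        self.a = 1.0 - math.exp(-2.0 * math.pi * cutoff_hz / sample_rate)
        self.y = 0.0

    def process(self, x: float) -> float:
        self.y += self.a * (x - self.y)
        return self.y

lp = LowPass(cutoff_hz=800.0)                            # the 'knob' the performer sees
print([round(lp.process(1.0), 3) for _ in range(5)])     # step response creeping towards 1.0

The performer habituates to the word and the knob; the difference equation and its derivation remain inside the black box.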

Let us now reverse out of the epistemological investigations and revisit the phenomenological point: it has been explored above how people learn acoustic instruments through an enactive and embodied practice. Digital music systems or instruments (such as Logic, Pure Data, The Hands or the reacTable) pose a difficulty here. They do, of course, provide for a certain virtual embodiment, but the locus of their real nature is fundamentally in the symbolic realm. Although we interact in an embodied manner with the computer using physical interfaces (moving our mouse on a two-dimensional plane, touching screens, or swinging Wiimotes), the interaction always takes place through symbolic channels of varied bandwidths. The interaction is primarily with graphical representations such as words, menus, cables or icons on the one hand, or abstract representations such as variables, parameters and functions on the other. This symbolic communication is based on a designed construction, which is inherently an approximation and an ad hoc representation of the task/world/music. What characterises the player of the digital instrument is a mental representation of the instrument's parameters, and a strong awareness of how easily and arbitrarily those can be changed. As such, the digital instrumentalist is always also a luthier (Jorda 2005), someone who consciously engages with the instrument as a dynamic and fluid tool of a contingent nature.

The constraints of acoustic instruments are of a different kind: they have naturally inbuilt affordances, but those are mechanistic and physical (typically in the domain of frequency range and timbre), not musico-theoretical and symbolic (typically the domain of notes and form). The acoustic instrument presents itself as a coherent whole where the interface and the sound engine are one and the same. Although the digital instrument can also be conceived of as an integrated whole (Bongers 2000; Jorda 2005), this is often done for the sake of practicality and is not the only approach that should be taken when such instruments are analysed. Particularly in intelligent instruments we find that the expressive design, and the determinant of the performance experience, is to be located at the symbolic computational level. This area, abounding with culture-specific models of music, has been largely neglected in the analysis of digital musical systems.

The abstract character of all system design constitutes the epistemic nature of digital tools. The tool is designed, its affordances and constraints are outlined, and the user's actions are predicted and delineated in the instrument's interface and interaction design. It is the designer who decides, with clear rational arguments, what is revealed and what is concealed in the use of the system. Whereas the body of the acoustic instrument is physical, the body of the digital musical instrument is intrinsically theoretical. Skill acquisition, the path to mastery and the nature of virtuosity are all features that are transformed with the digital musical instrument. From this perspective, when computers are seen as mediation, the role of the system designer is that of outlining a system that directs the energy from the physical interface to the work (or the terminus – the music) through the symbolic engine of the epistemic tool. This activity of blackboxing, of creating abstractions of activities where bodily movements and thoughts are represented as discrete chunks in time, grounds the complexity and the non-transparency of digital tools. Therefore, if there is a normative message in this paper, it would be an encouragement to acknowledge the theoretical, cultural and social context in which all tools are designed – and this implies an awareness by designers of the responsibilities they have in terms of aesthetic and cultural influence.

7. CONCLUSION

The journey from the phenomenology to the epistemology of musical instruments took the following form in this paper: first, it introduced Ihde's phenomenological modalities of tools. It was proposed that we could define, with some rough generalisations, the acoustic instrument as a channel for an embodiment relation with the world, and the digital instrument as a medium for a hermeneutic relation. Through the analysis of enactivism, the highly embodied nature of skill acquisition and performance on acoustic instruments was identified, but that thread was abruptly deferred in order to introduce the epistemological fundament of designed artefacts. An exploration of the extended mind helped to clarify how things external to the body can become part of its cognitive mechanism, thus rendering technology an integral element in musical creativity. The thread of embodiment was picked up again by exploring the different phenomenological relationships we have with acoustic and digital instruments, due to the distinct nature of their interfaces, and this divergence was described with regard to the conceptual and system-design complexity integral to all digital systems.

Consequently, the digital instrument was defined as an epistemic tool (a conveyor of knowledge used by an extended mind), and its symbolic nature was described as that of a designed artefact that affords cognitive offloading by the thinker or the performer. Although acoustic instruments may contain epistemic dimensions as well, a diverging factor between acoustic and digital instruments is the difference in mapping between a gesture that affects real vibrating material, on the one hand, and an action that is arbitrarily mapped to a symbolic system, on the other. I pointed at a fundamental difference between the acoustic and the digital in that, although both inhere knowledge of acoustics, the latter is typically designed through the top-down activity of classification (of sounds, gestures and musical patterns), where nothing is given for free by nature. The digital instrument is an artefact primarily based on rational foundations and, as a tool yielding hermeneutic relations, it is characterised by its origins in a specific culture. This portrayal highlights the strengthened responsibilities of the designers of digital tools, in terms of aesthetics and cultural influence, as these tools are more symbolic, and of greater compositional pertinence, than our physical tools.

This paper has focused on differences at the cost of similarities, and has divided into distinct groups phenomena that are best placed on a continuum. This has been done not in order to claim a preference for one type or the other, but for the sake of creating an awareness, as creators and users of instruments, of how much intelligence our instruments contain, from where this knowledge derives, and at what level it resides. The project of exploring music technologies in this manner thus becomes a truly philosophical, aesthetic and ergonomic investigation that can benefit from the use of a historical genealogy (as proposed by Nietzsche and practised by Foucault) as method. The question 'why does a particular music sound like it does?' can in this way be transposed into a questioning of the conditions in which musicians make music, which instruments are used, and the complex origins of those instruments. Now that music making has largely become a process taking place through digital tools, we should bear in mind that software has agency and necessarily inheres more cultural specifications than any acoustic instrument.

8. ACKNOWLEDGEMENTS

The author would like to thank Enrike Hurtado Mendieta for collaborating in the generation of these ideas over the years. Nick Collins, Chris Thornton, Alex McLean and Tom Ottway read the paper and made important suggestions. Finally, the author expresses his gratitude to the three anonymous reviewers and the editors of Organised Sound, whose comments made this text much better than it otherwise would have been.

REFERENCES

Anderson, M. L. 2003. Embodied Cognition: A Field Guide. Artificial Intelligence 149: 91–130.
Baird, D. 2004. Thing Knowledge: A Philosophy of Scientific Instruments. Berkeley: University of California Press.
Bertelsen, O. and Bødker, S. 2003. Activity Theory. In John Carroll (ed.) HCI Models, Theories and Frameworks. San Francisco: Morgan Kaufmann, 291–324.
Boden, M. A. 1990. The Creative Mind: Myths and Mechanisms. London: Weidenfeld & Nicolson.
Bongers, B. 2000. Physical Interfaces in the Electronic Arts: Interaction Theory and Interfacing Techniques for Real-time Performance. In M. Wanderley and M. Battier (eds.) Trends in Gestural Control of Music. Paris: IRCAM – Centre Pompidou.
Bowker, G. C. and Star, S. L. 2000. Sorting Things Out: Classification and its Consequences. Cambridge, MA: MIT Press.
Brooks, R. A. 1991. Intelligence Without Representation. Artificial Intelligence Journal 47: 139–59.
Brown, J. S., Collins, A. and Duguid, P. 1989. Situated Cognition and the Culture of Learning. Educational Researcher 18(1): 32–42.
Clark, A. 1996. Being There. Cambridge, MA: MIT Press.
Clark, A. 2003. Natural-Born Cyborgs: Minds, Technologies, and the Future of Human Intelligence. Oxford: Oxford University Press.


Clark, A. and Chalmers, D. 1998. The Extended Mind. Analysis 58(1): 7–19.
Dreyfus, H. L. and Dreyfus, S. 1986. Mind over Machine: The Power of Human Intuition and Expertise in the Era of the Computer. New York: Free Press.
Edens, A. 2005. Sound Ideas: Music, Machines, and Experience. Minneapolis: University of Minnesota Press.
Eldredge, N. and Temkin, I. 2007. Phylogenetics and Material Cultural Evolution. Current Anthropology 48(11): 146–53.
Hayles, N. K. 1999. How We Became Posthuman. Chicago: University of Chicago Press.
Heidegger, M. 1962. Being and Time. Oxford: Blackwell Publishers.
Hutchins, E. 1995. Cognition in the Wild. Cambridge, MA: MIT Press.
Ihde, D. 1979. Technics and Praxis. Dordrecht: D. Reidel Publishing Company.
Ihde, D. 1990. Technology and the Lifeworld: From Garden to Earth. Bloomington: Indiana University Press.
Jorda, S. 2005. Digital Lutherie: Crafting Musical Computers for New Musics' Performance and Improvisation. PhD thesis, University of Pompeu Fabra, Barcelona.
Latour, B. 1987. Science in Action: How to Follow Scientists and Engineers Through Society. Cambridge, MA: Harvard University Press.
Latour, B. 1994. On Technical Mediation – Philosophy, Sociology, Genealogy. Common Knowledge 3(2): 29–64.
Latour, B. 1999. Pandora's Hope: Essays on the Reality of Science Studies. Cambridge, MA: Harvard University Press.
McClelland, J. L., Rumelhart, D. E. and the PDP Research Group. 1986. Parallel Distributed Processing: Explorations in the Microstructure of Cognition, Volumes 1 and 2: Foundations. Cambridge, MA: MIT Press.
Nietzsche, F. 2004. Beyond Good and Evil: Prelude to the Philosophy of the Future. Cambridge: Cambridge University Press.
Pinch, T. and Trocco, F. 2002. Analog Days: The Invention and Impact of the Moog Synthesizer. Cambridge, MA: Harvard University Press.
Pinker, S. 1994. The Language Instinct. London: Penguin.
Polanyi, M. 1966. The Tacit Dimension. Garden City, NY: Doubleday & Company.
Schnell, N. and Battier, M. 2002. Introducing Composed Instruments: Technical and Musicological Implications. NIME 2002 Proceedings. Limerick: University of Limerick, Department of Computer Science and Information Systems.
Suchman, L. A. 1987. Plans and Situated Actions: The Problem of Human–Machine Communication. Cambridge: Cambridge University Press.
Sudnow, D. 2001. Ways of the Hand. Cambridge, MA: MIT Press.
Varela, F., Thompson, E. and Rosch, E. 1991. The Embodied Mind. Cambridge, MA: MIT Press.
Waisvisz, M. 1999. Gestural Round Table. STEIM Writings. http://www.steim.org/steim/texts.php?id54 (accessed December 2008).
Waisvisz, M. 2005. Personal communication whilst artist-in-residence at STEIM.
Whorf, B. 1956. Language, Thought, and Reality: Selected Writings of Benjamin Lee Whorf, ed. John Carroll. Cambridge, MA: MIT Press.
Winner, L. 1980. Do Artifacts Have Politics? Dædalus: Journal of the American Academy of Arts and Sciences 109(1) (Winter): 121–36.
Winograd, T. and Flores, F. 1986. Understanding Computers and Cognition. Norwood, NJ: Ablex.
Wittgenstein, L. 1969. The Blue and Brown Books. Oxford: Blackwell Publishers.
