Belinda Barnet

Interview by Simon Mills

SM: I sense some uneasiness in your description of the owners of mobile phones living in a constant state of anticipation, as though these devices distract us from actually living in immediacy in some way. This sense of constant distraction may have reached its zenith with applications like Twitter and messenger apps. What’s your view on how these technologies could be seen as having a directive effect on us? Do you fear they are having some kind of detrimental effect on young people who grow up immersed in these always-on networks?

BB: I think mobiles are definitely having an effect on society and on youth culture in particular, but there is little public discussion of this. In Australia, the media is more interested in hysterical stories about mobile phones causing brain tumours or literacy problems in children. The real-world effects of mobile use are more subtle, and they touch all of us – not just kids. I don’t think these effects are detrimental either. As you point out, mobile devices are always on, always connected to the network – and in Australia at least, they are in the pocket of over 96% of the population. Many of these devices are also equipped with cameras and the ability to send and receive images. So for the first time in history, we have a citizenry who are in perpetual contact with the network, who are able to send and receive images wherever they are, who are never ‘offline’ unless they choose to be. If there is any kind of news event, a natural disaster or a celebrity sighting for example, then someone is usually there with a camera in their pocket to capture it: perpetual surveillance.

I think this always-on connectivity is having an effect on our psychology as well. When you have a mobile in your pocket, you are always aware of it; you check it for messages several times a day, and part of you is always listening for its ring. Mobile users have a constant low-level awareness of their device; the possibility that a message may arrive at any instant inhabits that awareness. It’s like expecting a visitor or having a pot slowly boiling in the other room: your attention is split. Linda Stone coined the phrase "continuous partial attention" to describe what happens when our attention is divided over many tasks at once: TV, computer screen, phone. This is of course symptomatic of a modern work environment, but I think mobile devices exacerbate it because they are always with us.

SM: Your mention of the ubiquity of camera phones also relates to the current buzz around user-generated content and Web 2.0. Web 2.0 seems to be a meme that divides opinion, as does the plethora of other 2.0 spin-offs (e.g. Media Studies 2.0). How do you view this development within the context of the past 10 years of cyberculture?

BB: The term "Web 2.0" is useful to describe some recent trends, but we need to remember it’s a marketing term. It was coined by the online publishers O’Reilly Media to describe a whole grab-bag of different applications and business models. The phrase is often used as though it were revolutionary, but it doesn’t use any new technical standards; it’s the same web it always was. It’s not a "new" web. We’re just using the same technologies to build applications that facilitate social networking, or user participation, or collaborative media production, or whatever is on their list. The definition changes every time they have a conference, so I’m not sure what their current “mind maps” include! It’s a useful term if you work in marketing, PR or business development – but I think academics and media students should be more critical. The idea of Media Studies 2.0 is quite funny; is that where the students collaboratively write the course material and give the lectures to each other? Love it.

When you situate “Web 2.0” in the history of computing it’s not a new thing either. Most of the applications identified as Web 2.0 have a long history and existed well before the phrase was coined in 2004: social software, blogging, wikis and podcasting, for example. Email is entirely user-driven – and that was around long before O’Reilly’s mind maps and bright bar charts. Even the web development technique they are currently frothing at the mouth over, Ajax, is composed of technologies that have been around for over a decade. Marketing terms are useful only to a point, and should not be used excessively in serious or critical discussion. Sorry, this term irks me – you probably didn’t want a response like that!
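
To make the point concrete, the classic Ajax pattern boils down to ordinary page scripting plus the long-standing XMLHttpRequest object. A minimal sketch (written here in TypeScript; the endpoint URL and element id are made up for illustration):

    // Fetch a fragment of data asynchronously and update the page without a reload –
    // nothing here that wasn't available years before the "Web 2.0" label appeared.
    function loadFragment(url: string, targetId: string): void {
      const xhr = new XMLHttpRequest();
      xhr.open('GET', url, true); // asynchronous request
      xhr.onreadystatechange = () => {
        if (xhr.readyState === 4 && xhr.status === 200) {
          const target = document.getElementById(targetId);
          if (target) {
            target.textContent = xhr.responseText; // update the page in place
          }
        }
      };
      xhr.send();
    }

    // Hypothetical usage: pull the latest comments into a placeholder element.
    loadFragment('/comments/latest', 'comments');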

SM: As someone who has been involved with new media over the past decade, I’m interested to find out how your thinking about it has changed in that time. In your previous answer you mention that the net hasn’t fundamentally changed, at least not technically. What has changed, if anything, especially for writers and artists?

BB: That’s a great question. Speaking for myself, the change is pretty simple; I’ve lost the rose-coloured glasses. In the early ’90s I was convinced that the internet was going to revolutionise literature, art and even education. Many writers and artists were at that time. This initial period of excitement tends to happen with the introduction of every new communications technology – radio, TV, computing, and now mobile devices. People think that radical change is taking place, and that this change is unprecedented. I think it’s a very human response – we want to believe that our era is important. It’s undeniable that the internet has had an impact on society, but it’s not the straightforward democratising effect that many of the hypertext theorists thought it would be.

Over the last ten years, I think the main changes on the net have been social. The most obvious change is the domestication of the technology, an exponential increase in the domestic penetration of the web. My 93-year-old grandfather Googles his research topics – put it that way! So although there hasn’t been a massive revolution in the technology itself, we have seen a noticeable shift in the way people use it and the number of people using it. Over the last five years in particular I think the shift in usage has been dramatic; the web is not just a place you go to find information, to do your banking and shopping; it is a place to actively create and publish fragments of your life, to develop a virtual persona in a virtual community, to build stories and objects and then share them. This is what Web 2.0 describes – a basic shift in the way the web is used, and the new business models that accompany it.

I think we’re going to see another shift soon, towards location-based social software. So at the moment, if you are using LinkedIn, MySpace or Facebook, for example, there is no indication of who among your friends is currently physically close to you. Imagine if you could sit in a cafe with your mobile device and know immediately who was within a block or two of you – maybe a friend of yours is in a shop just around the corner. There are already applications that help you find things you are looking for in your immediate vicinity (restaurants, for example). Most mobile applications incorporating location-based services (LBS) are about finding information, which is like the first phase of the web discussed above. I think we’ll be seeing a second phase soon, where location and community converge, which will be really interesting. For a while there on the web it didn’t matter where you were from, and in a way this will make location important again.
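
A minimal sketch of the idea, assuming friends share their last known coordinates – the names, positions and the "block or two" radius below are purely illustrative assumptions, not any particular service’s API (TypeScript):

    // Illustrative only: which friends are within walking distance of me,
    // given each friend's last reported position? All data here is made up.
    interface Friend {
      name: string;
      lat: number; // latitude in degrees
      lon: number; // longitude in degrees
    }

    // Great-circle distance between two points in metres (haversine formula).
    function distanceMetres(lat1: number, lon1: number, lat2: number, lon2: number): number {
      const R = 6371000; // mean Earth radius in metres
      const toRad = (deg: number) => (deg * Math.PI) / 180;
      const dLat = toRad(lat2 - lat1);
      const dLon = toRad(lon2 - lon1);
      const a =
        Math.sin(dLat / 2) ** 2 +
        Math.cos(toRad(lat1)) * Math.cos(toRad(lat2)) * Math.sin(dLon / 2) ** 2;
      return 2 * R * Math.asin(Math.sqrt(a));
    }

    // Filter a friends list down to those within roughly a block or two (~200 m).
    function friendsNearby(myLat: number, myLon: number, friends: Friend[], radiusMetres = 200): Friend[] {
      return friends.filter((f) => distanceMetres(myLat, myLon, f.lat, f.lon) <= radiusMetres);
    }

    // Hypothetical usage: sitting in a Melbourne cafe.
    const nearby = friendsNearby(-37.8136, 144.9631, [
      { name: 'Alex', lat: -37.814, lon: 144.964 }, // just around the corner
      { name: 'Sam', lat: -37.9, lon: 145.0 },      // across town
    ]);
    console.log(nearby.map((f) => f.name)); // ['Alex']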

SM: One topic that seems relevant at the moment is the idea that students are in some way changing due to their new media usage, and that education needs to adapt to this change or it will either lose the students altogether or at least miss out on a vital way of communicating with them. You’ve already mentioned the phenomenon of “continuous partial attention” that networked devices can engender, and also made reference to what I call the “euphoria/anxiety” response to new technologies. As someone who works within academia, are you aware of these changes, and do you feel education needs to significantly adapt to technological changes, or is this another bubble?

BB: Tertiary education must adapt to the technological changes that are occurring at the moment, from the way that courses are written and delivered to the way that students are assessed. This is not just because we will ‘lose’ students, but because what we teach will become irrelevant. From mobile devices and multiplayer games to weblogs and wikis, the way our students communicate with each other and with their environment is changing, and this change is accelerating. I think it is hard to adapt due to this acceleration. We had 100 years to adapt to radio, 50 to adapt to TV, 10 to adapt to the web, and it feels like 3G happened yesterday! It is difficult to adjust to this pace, particularly for universities; they aren’t designed to move quickly. Universities are designed to preserve knowledge and old ways of doing things; it takes decades, or even centuries, for them to change.

Although they were reluctant at first, universities had over 500 years to adapt to the technology of print. This created a whole new academic culture, the culture of the book. If we look back to the beginnings of the university in ancient Greece, there was a struggle even then against a new technology – writing. Plato thought this new-fangled device would ‘implant forgetfulness in [men’s] souls’; it would destroy their memory. People in his day were taught to memorise thousands of lines of poetry and long speeches as part of rhetoric, the art of ‘enchanting the soul’. He thought writing was an artificial memory that would eat away at our natural skills. He was right to be worried; we no longer teach or learn the art of memory at university; I couldn’t memorise fifty lines of poetry, let alone ten thousand.

With the introduction of any new technology certain skills and types of knowledge die out. The old skill of going to the library, borrowing a book, reading it and then really contemplating it is dying out. Students don’t read anymore; they graze. They Google it. I think this ability to sift through information like superfine flour is a new and important skill, but the fact is that there is not a lot of contemplating going on. I think this is what some academics are worried about; they are not worried about new technologies being used in teaching and learning (for example, webcasting), but about what we might lose or what might fall by the wayside in the process (for example, attending real live lectures!). I think it helps to focus on the new suite of skills that technologies bring with them. All technologies create cultures of use around themselves; they create new techniques and new ways of doing things that were unthinkable prior to the technology.

I see new types of skills evolving in my students, and in myself. People who have grown up using computers and new media can sift through gigabytes of information quickly and get to the important bits. We are searching, compiling, summarising and retrieving information. If I am looking for a particular piece of information, say a publication date, I get quite irritated if it takes longer than a few minutes. What a waste of my attention. We’ve already talked about the ability to focus on multiple different things at once – continuous partial attention. Attention is at a premium, and it is spread across many different things at once. Students routinely sit in lectures and take notes on a laptop and send emails and check their phones for messages. There are many more things going on at once than there were 50 years ago.

SM: I’m interested in how you view the relationship between human beings and technology, and how you think Heideggerians such as Bernard Stiegler have reconfigured this relationship in contemporary critical theory.

BB: Stiegler is one of the most ground-breaking and original philosophers writing on technology today. He’s influenced my own work on the evolution of technology, but more importantly he has brought the question of technics back into philosophy, and taught us to think differently about the relationship between human beings and their tools. I’m a bit of a fan.

In contemporary critical discourse, there is much discussion about how human societies create and shape technologies, and also how human existence is irreducibly technical (philosophy has known this since Heidegger). This was happening long before the first English translation of Stiegler’s work. In Media Studies, we favour social and linguistic interpretations; as Geert Lovink once put it, one could be forgiven for thinking that discourse on technology begins and ends with a critique of language. At the moment, it’s all about signs and social networks (actually, scrap that, it’s all about Second Life and Facebook!). The social constructivist or ‘social shaping of technology’ school is also quite strong – for example, the actor-network approach of Latour, who you are interested in.

At base though, there is still this assumption that human beings create technics; technics do not pre-exist or constitute the human. It is human beings who design anti-aircraft systems and build mobile phones, and human beings who dream up engineering discourse. It was US government and military interests that led to the creation of computing, and avant-garde literature that led to hypertext. Ultimately, it is human societies who shape technical objects; human beings invent technics. For me the interesting part of Stiegler’s work is the idea that it is technics that invents human beings; it is technics that constitutes the human. As a species, we are characterised by our non-adaptation, by our relation to technics; our memory is transferred to books, our "strength multiplied in the ox, our fist improved in the hammer", as the French archaeologist Leroi-Gourhan puts it (1993). We are technical beings, and there is no purely human essence or nature unaffected by technics; there is an originary supplementarity. In effect, Stiegler takes human beings off the pedestal in the human-technology relationship; traditionally it has been quite anthropocentric, as though there were a "natural" human species and technics were a mere extension of it. This "originary technicity" is a Derridean concept of course, which Stiegler radicalises. Whereas Derrida is concerned to articulate the relationship in terms of a "logic", the logic of difference, Stiegler is concerned to articulate it in terms of its historical differentiations in different technical systems.

So human beings invent themselves within technics, and there is no purely human nature that exists outside of technology. This is not technological determinism, because there is no human essence to be shaped by technology in the first place (the charge of technological determinism assumes there is some basic human "freedom" to defend). This is also not a blissful "coupling" or symbiosis in the sense of Haraway’s cyborg; this is not a metaphor for subject-constitution, or more precisely, this is not a choice. I’ve always found Haraway’s work a little too utopian. Students love it because it has this central trope or image of the cyborg, so it is useful to an extent, but I don’t think it tells us anything about human beings and their tools. The human animal is at base a technical being, and has been from the inception of the species. This is not a metaphor for a new feminist politics, it’s a fact of life.

So I’m interested in the idea that technics invents the human because I’m interested in human evolution, and at base, Stiegler is redefining what it means to be human. In order to do this he synthesises and critiques the work of Heidegger and Derrida, but also draws on work from other disciplines, like palaeontology, evolutionary biology and archaeology. Admittedly the science is not always right (Stiegler’s interpretation of what epigenetics means, for example, is not what microbiologists mean by the term), but it’s close to the mark. Usually people approach Stiegler through either Heidegger or Derrida, but it would be interesting to see some work done from a biological or archaeological perspective, for example. A couple of years ago I interviewed one of the world’s most accomplished palaeontologists, Professor Niles Eldredge, and put some of these theories to him, including Stiegler’s idea that technical objects contain within themselves a technical lineage. The result was quite interesting – Eldredge has his own exciting theories about how technical artefacts evolve over time (http://journal.fibreculture.org/issue3/issue3_barnet.html), but I don’t think he’ll be rushing out to buy Stiegler’s books anytime soon.

I’m probably also thinking about human life because I’m 38 weeks pregnant! Sure beats ruminating on paint colours for the nursery. I’d better wind up I think.

Since the beginning of time, physicists tell us, the entropy of the universe has been increasing. Matter has a tendency to disintegrate, to lose energy and form over time, to move towards disorder and chaos. As I see it personally, life is about the preservation of form in this flux. One way it does this is through that most primary form of writing – DNA. On another level, we preserve things as a species in artefacts, in language and in culture; in technics. Human beings have always felt compelled to capture fragments of their lives, to store and transmit memories; we have inscribed ourselves in books and on cave walls, in folk songs and on New York subway benches. I think it is one of the most primary reflexes of human life – preserving memories. Technics as a form of memory is also something Stiegler explores in his book, Technics and Time. I think I should stop there, or I’ll rabbit on forever! If you want to read more about human evolution and technology, Niles Eldredge and other interesting bits, see my essay in CTHEORY.


8 November 2007