Nowadays, whoever manoeuvres anything onto his computer via an Internet address - something that informs, makes a noise, glitters or squeals - no longer assumes that there is a technical separation between data and their processing in his machine. The illusion of a transparent or even reversible decoupling of information from the (universal) medium computer has meanwhile been abandoned by every (p)c-user who, in the daily update struggle with proprietary file formats, is no longer in command of what he so innocently typed in only yesterday.
It was probably the committed Unix gurus, mainly at the University of California, Berkeley, who suffered most when they had to abandon this illusion: the state-regulated monopoly of AT&T Bell Labs was split up, smoothing the way for the commercial marketing of Unix.2 Source code, which the user can read, and binary machine code separate irreversibly during the compiling process. Withholding the source code and releasing only the binary therefore provides an extremely efficient safeguard for programming knowledge. The user who routinely entrusts himself and his data to Redmond's aptly named folder "personal files" certainly cannot evade what holds for software in general. Or, to be system-theoretically precise: medial objects - whether female celebrities or simple text - constitute themselves in and through communication, and not the other way round.3
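The irreversibility of compiling can be illustrated in miniature with Python's own bytecode compiler (a sketch standing in for the C toolchain the text refers to, not the Unix build process itself): the compiled object still runs, but the readable text the author typed cannot be recovered from it.

```python
# A sketch of the source/binary split, using Python's bytecode
# compiler as a stand-in for the compilers the text refers to.
src = "def greet():\n    return 'hello'\n"

code_obj = compile(src, "<memory>", "exec")  # readable source -> code object

ns = {}
exec(code_obj, ns)          # the compiled form still runs...
print(ns["greet"]())        # -> hello

# ...but what could be shipped instead of the source is opaque bytes,
# from which the commented, readable text cannot be reconstructed:
print(type(ns["greet"].__code__.co_code))   # -> <class 'bytes'>
```

Shipping only the second artefact while withholding the first is precisely the safeguard described above: the user can execute, but no longer read.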
Now what does this mean for a university discipline like Cultural Studies, which at once observes the course of media development and its correlations (Media Theory) and considers how it would like to (re-)present itself in the medium of the Internet (Media Practice)? The student of Cultural Studies realises that any Internet representation can be, at most, a presentation, and that his work on the seemingly static-transparent front of the HTML code suggests something which has long been a figment of the imagination: information retrieval. Meanwhile, the Internet has managed to establish itself as the technical myth of the century: the rough collection of W3 computers connected to each other by simple (opto-)electric cable has mutated into the virtual net.
First-generation websites did not take this virtuality too seriously. Hitherto the Institute for Cultural Studies at the Humboldt University in Berlin was largely content simply to document its theory competence in matters of the New Media - as well as the "usual" content, of course. Thus the website
raises the following kinds of questions, and in some cases provides concrete answers. What is represented in the digital media, and in what way: human business or knowledge? How is the relationship between archive and storage defined in the computer? How are the basic cultural techniques of image, letter and number related to each other in the computer? How realistic is it to distinguish between digital and analog only on the material side of information processing? How can the computer be analysed and discussed as a technical medium as well as a cultural practice? And so on.
However, these reflections on the media necessarily defeat themselves as long as they move through the undergrowth of the HTML code as if they were dealing with pen and paper. To a certain extent such behaviour seemed legitimate and sensible, yet the Internet would not be the Internet if it did not express its medial conditions less ambiguously than analog media. Moreover, since the pages of the Institute for Cultural Studies had by then reached such a size that updating the "existing" could only be achieved at the cost of less "new", the danger of the whole site going out of date became apparent. In other words, beyond a certain complexity, keeping all existing "contents" simultaneously accessible required that they not only be statically compiled and available, but also interact flexibly with actual access activity.
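The difference the passage points to, between contents compiled once and left static and contents generated at the moment of access, can be reduced to a minimal sketch (the names and data here are hypothetical, not the Institute's actual pages):

```python
# Hypothetical sketch: a static page versus a page generated per access.

# Static: the HTML is fixed text; every change to the underlying data
# means editing the file by hand, and the page ages the moment it is saved.
STATIC_PAGE = "<ul><li>Seminar A</li><li>Seminar B</li></ul>"

# Dynamic: the page is a function of the current data, produced anew on
# each access, so it cannot fall out of step with its contents.
def render(seminars):
    items = "".join(f"<li>{s}</li>" for s in seminars)
    return f"<ul>{items}</ul>"

print(render(["Seminar A", "Seminar B", "Seminar C"]))
```

The static variant answers every request with yesterday's state; the dynamic one makes the "contents" interact with the actual access, which is exactly what a growing site demands.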
Demands like these under the sign of virtuality eventually lead to advanced Net-art actions like knowbotic research
or those of the V2 Organisation
which show most impressively how "fluid" the new media are. Yet to what extent can potentialities of this sort be opened up under the conditions of a university course such as Cultural Studies, which aims to couple Theory Competence with Media Practice? We think that such keywords as low tech, vanishing interfaces, or user-friendly point in the wrong direction. Uncontrolled clicking of the mouse until it works will never open up or describe the limits and potential of a (medial) system, let alone adjust these limits.
In fact - and this result cannot be ignored - graphical user interfaces, which spare the user the apparent detour via command lines and source code, dominate daily interaction with the computer medium more and more. The advantages of such a "separation of form and content", as the software producers call it (a phrase as effective in their advertising as it is in our context), are obvious. Interaction with digital data material (contents or assets) is enormously accelerated, since the author no longer has to be programmer, designer and editor at the same time. However, the reduction of complexity by these interfaces is not (yet) able to conceal another aspect,4 namely that each form of differentiation, whether between "form" and "content", code and content or the like, is exactly as meaning-generating as the difference between zero and one.
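What the producers advertise as a "separation of form and content" can be sketched minimally: the same content poured through two different forms (all names and strings here are illustrative). That the choice of form already shapes what the content means is precisely the point insisted on above.

```python
# One "content", two "forms": the separation the software producers advertise.
content = {"title": "Kulturwissenschaft", "body": "Reflections on the media."}

html_form = "<h1>{title}</h1><p>{body}</p>"   # presentation for the browser
text_form = "{title}\n\n{body}"               # presentation for plain text

print(html_form.format(**content))
print(text_form.format(**content))

# The split speeds up production, yet it is itself a distinction that
# generates meaning: neither rendering is the "content as such".
```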
Thus we have approached a possible answer to the central question concerning the medial competence of Cultural Studies. Above all, the linking of theoretical and practical media competence has to confront the mechanism of meaning generation in the computer itself. That means, on the one hand, pushing through the surface to the programme structures behind it. At the same time, it implies questioning the self-evidence of proprietary programmes and formats. Taken together, the answer to the original question concerning the position of Cultural Studies within the New Media can only be taken up where the thread was lost in 1984: in the ethos of free cooperation, whose aim is to increase the knowledge of some form of community as a whole while at the same time developing something mutually. Only when the medial conditions of the New Media are constantly reflected upon, when meaning-attribution is not confused with information storage in daily interaction with the computer, can multimedia forms of teaching and learning be integrated into the "normal business of higher education", as Edelgard Bulmahn never tires of emphasising. And even if the President of the conference of vice-chancellors, Klaus Landfried, threatens a seemingly chaos-theoretical "self-organisation of learning", there is only one example of this in the history of the New Media: that of the open-source movement. The demand raised by Cultural Studies for a linking of reflexive and practical media competence will in future no longer be able to sneak around the insight that cultural reality will henceforth also be written in computer languages.
(1) Stallman, Richard (1998). The GNU Project.
(2) Foerster, Heinz von (1981). Observing Systems. Salinas.
(3) Luhmann, Niklas (1995). Die Kunst der Gesellschaft. Frankfurt a.M.
(4) Serres, Michel (1980). Der Parasit. Frankfurt a.M.
1 Revised version of a lecture in the context of the workshop "Internet in der Lehre" at the Humboldt University in Berlin on 22 September, 2000. Translated by Kate Chapman, Berlin.
2 In the eighties, Bell Labs gave the Unix source code to universities at cost price. After commercialisation in 1984, $100,000 and more had to be paid for it. And today, if need be, the crown jewels can be "purloined" out of Redmond. Cf. Richard Stallman's article "The GNU Project", (1).
3 Cf., e.g., (2), (3) or (4).
4 Apple's most recent operating system, number 10, appears all too innocently with an X in its name, so that it must surely prove to be a real neXt step: "Mac OS X takes the graphical user interface to the next level of computing - and beyond." So runs the main advertising message.