-hc / "Incommunicado" Amsterdam Workshop, Jun'05 / draft FORMATTING THE MESSAGE ====================== Roberto Verzola introduced the "intermediary", the role of ICT owners as rentiers - the cyberlords - of the globalised planet. I'd like to expand this analytical concept along another, so-to-say "inner" dimension: namely, how this works even inside the ICT realms. Certainly, the ITC business has a same economical role even in the very environment which profits most from it, and where the cyberlords indeed have their castles, the "industrialised" countries of the OECD. And certainly, the nominal share of their rent is grossly underestimated in the usual calculations of GNI and the like (OECD counts the GNI share of ICT at an average of around 7 per cent only: but the sampling of this data is simply ridiculous[1]). For everyone of you working with the stuff it's quite obvious how far ICT has invaded everyday "economically active" life (and beyond); and those who have to do an institute's or organisation's yearly accounts - or even the tax declaration if you're an "independent" - it's clearly visible how more and more of its growing share of "costs" goes not so much to hardware but ever more to licenses and "services" of all sorts which feed the cyber rentiers. Some time ago I did a small survey among a number of NGO offices in Brussels which revealed that their ICT costs had clearly passed costs for their core "business" of lobbying, all phsical mobility and costs for face-to-face meetings. But there is more to it. All communication is "social" - which is an apparently trivial observation. However, look at it the other way 'round: if there were means to make this intermediary function of communication tributary to rent payments this could be an enormous source of extra profits - and this the more so the nearer you can set the point of taxation to a lowest possible level of so-to-say "human hardware". 
In terms of the mechanics of communication this would be the very cognitive activity, a person's capacity to recognise an external stimulus as a meaningful sign - to identify among the noise those phonemes and sounds which make up spoken language and music, and to recognise among the pixels on a monitor screen the letters and signs which make up written text and meaningful visual impressions. Imagine, then, that the letter "a" may be communicated only if the originator of a message has paid a toll for using it in a specific form or "format"; and, on the corresponding side, that the letter "a" becomes visible only after the receiver of the message has likewise paid an access toll. The real thing, thus, would be to get at the most abstract and, at the same time, most basic[4] layer - which is writing; and, more elementary still, the very form - or format of typeset - of the letters and signs which constitute the symbols of written content. But how do you make people pay tribute(s)? In the olden days, forcing people to pay a levy was straightforward waylaying: to visit a cousin in the neighbouring village was to run the risk of getting robbed on the way through the woods, or at the toll gates of the local baron. In the digitalised environment, the club of the robber, or the axe of the bailiff, is the proprietary format. [Besides: no debate is intended here on the relative importance of audio vs. visual cognition; it seems clear, though, that "written" language is dominant - even on the most technical level, where all sorts of content indeed are "coded" in written form.
Face-to-face talk has natural limits, though.[3] But think of what is involved in a gathering like this one: the use of the microphone/headphone probably involves well over a thousand patent licenses which had to be paid for, and if just one of Intel's microprocessors is involved in controlling the sound here, that would make it a good hundred thousand.[2] Or another example: the explosion of GSM/cellphone use, and there again of "SMS" - but all that is still dispensable, still not low-level enough to be unavoidable.] But there had still been some obstacles to exploiting this taxation at a truly basic level of communication. In the early days of - more or less "personal" - computers it was the hardware manufacturers who had the say, and they had a strong position vis-a-vis the copyright holders of software. Nor was turnover important: the one-off payment of the copyright fee for the use of just one typeset did not weigh heavily in the total cost of, for instance, the BIOS chips which then provided the one and only typeface used for reproduction on screen. "Line printers" had a rather restricted set of fonts; and printer manufacturers finally lost the battle over fixed fonts and had to adapt to the software-driven reproduction of pixel dots. Today we are almost there. The LARGEST part - and this is a very circumspect attribute, in order to avoid a libel suit - of all practically useable typesets for letters and signs is by now the privatised "intellectual" property of two firms, Adobe and Microsoft.[4] What was still missing up to now, however, was a watertight system for the control of use. Sure, vendors of other composing software had to steer clear of lawsuits and duly paid their license fees if they wanted to deliver the most popular fonts within their program packages. [And by the way, it is these B2B relations which are the subject of those "IPR" negotiations presently conducted at the level of WIPO and WTO-TRIPS.]
The wide-open field for escaping the rent levy was the ease and near-zero cost of the digital copy. Thus each and every software package sold through so-called "legal" channels was nevertheless in itself the source of uncontrollable numbers of copies which had not gone through the toll gate. Stories about the "fight against software piracy" abound; I need not add any. What is less recognised, however, is the link between this type of artificial "illegality" and the "security" issue. Perhaps there will be time to go further into the technical details of this, but it seems important to point out already here the instrumental role of "Angst" and fear of insecurity, which was decisive in pushing through the building of the control structure for digital copyright. Work on that had been under way for some years, with finally the unfolding of the TCPA - the "Trusted Computing Platform Alliance" (1998) - and its centralisation into the TCG, the "Trusted Computing Group" (since 2003), with a restricted membership of only the most powerful cyberlords. By now we _have_ the TPM - the "Trusted Platform Module" - the chip or chip section welded into every mainboard sold; and, perhaps even more important, into "embedded" processing gear like handhelds, cellphones and all sorts of special-purpose instruments. What is less known is that Microsoft's "Windows XP" operating system was at the outset intended to use "Trusted Computing", only that neither the hardware was ready nor were the clients ready to accept it - Intel provoked an outcry with the CPU identification number which became part of the Pentium III architecture in the late '90s, and M$uck experienced a major flop with its attempt to tweak the Internet into a "Microsoft Internet"; the present Mickey-".Net" bears only a feeble resemblance to the original project.
And here, finally, comes the club for the cyber-robbers and barons: from the end of this year on, the next M$ OS, named "Longhorn", must be installed on "all" PCs - M$ has announced that it will stop support for its present "Windows", even "XP", after 2006. The new operating system will, together with the TPM chip, simply control _all_ input and output and check it for "legality". According to the official announcement of "Longhorn" in April, this copyright watch is in the first instance designed to check all video and audio signals of "high" resolution and "high" quality which are produced following the most recent copyright protocols of the "content industry" and thus have built-in control features. So this is precisely that "Digital Rights Management" or "DRM" which has been pursued with the "Digital Millennium Copyright Act" in the USoA since the pre-Bush era, and with a series of recent "modernisations" of patent, copyright and broadcast laws in Europe. With the difference that this is not a rule on paper but a hardwired technical solution in the machine and its software. Two things are quite clear from these first official M$oft announcements: firstly, "external", "imported" source materials which lack any "Digital Rights Management" specification will be reproduced either in artificially decreased quality, or not at all on some of the interfaces to external devices. For instance, the most traditional and simple wired video and audio "line" connections will not function any more if the peripherals there do not have at least the presently required anti-copying "protection". Secondly, it is quite clear from the drawings that this "protected environment" for audio and video input and output checks the "rights" of just about every "application" you want to run on future "Windows" machines. That is, every piece of software used must carry some sort of "certificate" of origin so that the "protected" system allows it to run and to work with "protected" content.
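In schematic terms, such a "certificate of origin" check amounts to gatekeeping by cryptographic digest: before an application is allowed to touch "protected" content, its binary is hashed and the digest compared against a vendor-blessed list. A minimal Python sketch of the principle - the allowlist and its single entry are invented for illustration; real schemes use signed certificate chains, not a bare hash list:

```python
import hashlib

# Hypothetical allowlist of "certified" application digests, as a
# DRM-style gatekeeper might hold it. The entry below is simply the
# SHA-256 digest of the bytes b"test", chosen for illustration.
CERTIFIED_DIGESTS = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def is_allowed_to_run(binary: bytes) -> bool:
    """Allow an 'application' only if its digest is on the allowlist."""
    return hashlib.sha256(binary).hexdigest() in CERTIFIED_DIGESTS
```

The point of the sketch is the asymmetry: the gatekeeper needs no understanding of what the program does - any byte changed anywhere, including your hand-crafted font renderer, produces an unknown digest and is simply refused.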
So much for even your hand-crafted font, which has not been, and cannot be, "legalised" with the "Trusted Platform Module" in your machine. The TPM will finally help to fight spam and evil hackers too, it is said. Email and just about any packets arriving from the internet will be checked against the "legitimate", unique identification signature of the originating machine. And yes, so-called "older" PCs will not work so well with it; you will have to buy new ones.[5] While some newer peripherals, like TFT monitors and DVD burners, will only work with "Trusted Computing" activated, or not at all.[6]

What DRM implies:
=================

A short recourse to basics again: the PC is fundamentally a stand-alone production unit (with the additional benefits it can draw from the Net). Certainly, there was the precursor of the "dumb terminal" connected to the "mainframe" - but remember that for years the primary means of exchange between units, between PCs, was some medium: at first the exchange of tapes, then of various diskette formats, but in any case the physical exchange of media between separate, stand-alone units. The marriage with telecommunication, let alone networking, was not at all foreseen in the beginning. On the other hand, the origins of the internet clearly stem from the other end of the spectrum, i.e. interconnected mainframes; and the affinity of large telecom switchboards with the scaled-up switchboard architecture of computer "mainframes" was almost a matter of nature. There, input/output functionality and "human interfaces" were considered marginally important. (Just remember those telex tapes and punched cards which served for input, and those monsters of IBM printing machines with their jumping bars carrying the full set of alphabetic letters, hammering on the paper of endless zigzag-folded forms.)
This dialectic relationship between the two lines of development behind today's networking should be kept in mind - and, I think, should be maintained, given the present push towards re-centralisation on a much more abstract level: the individual, stand-alone and by now even mobile unit has been tweaked and seduced into becoming the vehicle for the penetration of a pre-determined and lopsided type of communication, in an ever stronger push for the centralisation of networking - imagine the hyped web "portals" as the mainframes of the cyberlords. At the same time, the capacity of the stand-alone unit as a production tool all by itself - of being, so to say, its own mainframe - is the very means of resistance against that centralised control. Anyway, in the course of events the traditional means of producing "communication" - and even more so of traditional "mass communication" by broadcast, printed newspapers and the like - have been profoundly changed: the physical reproduction of "content" (and, by the way, its share of the cost of production!) becomes completely de-centralised - "content" has not only to be re-produced, but is indeed re-constituted at the receiving site. (A.) HENCE there is the demand, from the originator, for "control" over the form and content of the original "product". This is doubtlessly a legitimate concern from the point of view of a creator, but it has far-reaching implications given the economic realities of the given conditions of networking - it reinforces the push towards a "presentation" culture in communication(s), and demands total control over reconstitution and (re-)presentation. The intervening condition is that anything produced must be of economic value - or profit - that is, carry a price which has to be realised by successfully selling it. The collection of the price is the salient point here: there must be tight control so that the buyer/taker doesn't run off with the product without paying.
The only way to exercise this control is through the form, _not_ through the "content": the idea of the letter "a" is too abstract, and too variable, ever to get a grip on. Thus its tangible form, the copyrighted "format" of its presentation, is the hook with which to catch that profit. This is the "qualitative" angle of a tendentially totalitarian control of "formatting". [Qualification: it is not the author/originator who is "totalitarian"; that one is simply "unique" in the choice of expression, and this is not a matter of pluralist decision or not. The element of total control comes with the procedure of (net-)distribution and (digital) re-construction of "copies".] (B./1.) And then there is the "quantitative" angle to it. Quick recourse to technical history again: the stand-alone, "individual" nature of the PC was the real "revolutionary" step in that whole computing history. "Revolutionary" in that it created the material means for "anarchy" - hitherto, the treatment of masses of information had been the privilege of administrative hierarchies, the army of bureaucrats of the powers that be (in business as well as in government, but likewise in the social hierarchy of academia); now it was within the reach of just "any" citizen. [Maybe we can talk about the initial "prices" and the process of "innovation" on another occasion - in this context it may be sufficient to point to the difference between real and false technical "innovations": a criterion for the false ones would be that they restrict, instead of enlarging, the "use value" of things. A typical example of false innovation would be the new version of M$ Void whose files cannot be accessed by the previous one.][7] So this creates another problem for the "control" of whatever use of programs and data: the sheer number of individual units/users to control.
The successful way is not necessarily through the basic operating system but through the installation of a yet more abstract set of "rules" (or "layers" or "protocols") which have to be served by OSs as well as by all application programs; so programmers as well as hardware manufacturers would do best to comply with these and support them. That is exactly what M$ "Longhorn" does - but so does even Linux: work is in progress which will guarantee the compliance of the very Linux "kernel" with the prescribed checks through use of the TPM control chip. Thus even the major "alternative" platform will precisely reinforce the scope of use of restrictive "rights". Just to mention: already now the list of devices with "hardwired" copyright-check compliance is impressive. To name but a few: DVD drives; FireWire encryption/decoding ("Digital Transmission Content Protection"); memory sticks ("MagicGate"); the Serial ATA HD interface; the audio SPDIF interface; TFT displays (DVI interface); the analogue TV interface (which even blocks copies from VHS video); USB - for exchange media of all sorts (already implemented in DVD video recorders).[8] (B./2.) Along a quantitative dimension, the expansion of Internet use created another "challenge" for centralised controls. The new quality of the Net (and the Web) was its very constitution as "peer-to-peer" connected units. This is all too easily forgotten in the noisy row about the Napsters and the like. Almost as eagerly _made_ forgotten is the role of the EU, and specifically the EU Commission, in privatising the WEB: the WWW development was done at, and was in its entirety publicly financed through, the European research institution CERN; and CERN is not even a common instance restricted to EU members. Nevertheless, it was the EU Commission - and there specifically its German member Bangemann - which pushed and steered towards establishing a private "industry" consortium, and thus enclosed the W3C away from the public commons.
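The kernel-compliance checks mentioned above rest on the TPM's "measurement" primitive: each component loaded at boot is hashed, and the hash is folded irreversibly into a Platform Configuration Register, so that the final register value attests to the entire software chain. A minimal Python sketch of this extend operation - an illustration of the principle, not a TPM driver; TPMs of that generation used SHA-1 with 20-byte registers, and the component names here are invented:

```python
import hashlib

def pcr_extend(pcr: bytes, measurement: bytes) -> bytes:
    """TPM-style extend: new PCR = SHA1(old PCR || SHA1(measurement))."""
    digest = hashlib.sha1(measurement).digest()
    return hashlib.sha1(pcr + digest).digest()

# Simulate a measured boot chain: firmware, bootloader, kernel.
pcr = b"\x00" * 20  # PCRs start zeroed at reset
for component in (b"bios-code", b"bootloader-code", b"kernel-image"):
    pcr = pcr_extend(pcr, component)

# Any change to any component, in any position, yields a different
# final PCR value - so a remote verifier can refuse to serve
# "protected" content to a "non-compliant" software stack.
```

The design choice worth noting is the chaining: because each step hashes the previous register value in, you cannot skip, reorder or retro-fit a measurement, which is precisely what makes a "patched" kernel detectable - and an unsanctioned one excludable.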
And even if the W3C tries its best to keep the WWW standards public, it cannot, by structure and construction, avoid the inroads of its most important corporate members - allowing, most importantly, the expansion of the use of proprietary formats. Nor could it avoid the very structural change from the peer-to-peer Net of exchange to a commercially and server-push dominated "presentation" WEB, where exchange and "interactivity" have degenerated into mouse clicks. Tim Berners-Lee, co-inventor of the WWW and still the W3C director, has some quite bitter words about this. Yet there are still some strong remains of the "public" nature of the WEB: the TCP/IP protocol, the command languages HTTP and CGI, and the mark-up language HTML as the public standard for page presentation. One can share the widely held skepticism towards the ITU, the international standards-setting body, and specifically towards the costs and exclusivity of that inter-governmental club. [Steve Cisler raised some very pertinent points in that respect!] However, the maintaining of HTTP/CGI as the crucial standard for interoperability on the Net is doubtlessly a merit of that bureaucracy. I am a little less frightened than Lawrence Lessig about the hardware menace of the TPM chip; its use, and some of the implications of its use, can perhaps be avoided for as long as the basic public standards for interoperability are left in force. But I am much more frightened by the eager stupidity with which all sorts of "public" actors and instances have embraced proprietary formatting, and thus have already created the precondition for DRM use. Any visit to the websites of public instances shows the symptoms of the condition: .PDF, .DOC, FLASHes all over the place, like the barbed wire of the privatised coastline at the Côte d'Azur. (Not to speak of the plethora of proprietary pixel and sound formats.)
A short extempore here: there is indeed a public ISO standard named "PDF/X", which came about in 2001 and 2002 after endless squabbles - Adobe is a member of the pertinent standards committee, but reportedly has been one major reason why standardisation lags far behind the present praxis in the printing business; there, the recent version of the Adobe Acrobat tools is the condition of business, and a number of its features are incompatible with the ISO standard for PDF. And do not think that Adobe's so-called "free"[9] "Acrobat Reader" could access all the features of the original Adobe PDF file format. Sure, Adobe does publish most of the file specification (which in itself is apparently good business: you would not get those more than 1000 pages for nothing). But no one can or does prevent the firm from using its own, undocumented properties [sic], or from changing them ad lib.[9] No great wonder that none of the other available tools for treating PDF files can indeed offer the full feature set. And the problem is not confined to the files and their format as such, but extends in particular to the "embedded" components, like the fonts to be used. Note that seemingly standards-conformant things, like HTML-formatted eMail attachments, are soaked with proprietary elements, perfectly ready to be subjected to "Digital Restrictions Management". Some examples[10] taken from eMails of the organisers of this very conference can show that even attentive people can all too easily fall victim to the built-in traps. In these three seemingly harmless, standards-compliant "attachments", each time at least one - and in the case of the Apple-Mail-produced, Microsoft-proprietary "rich text format", fully five - proprietary typesets are required for the reproduction of the originals. All of these are to be subject to "legality" checks by DRM; and, most important, to the license payments asked for their use.
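How deeply proprietary typesets are baked into "harmless" HTML mail is easy to check for oneself: scan the markup for font-family declarations and compare the named faces against a list of freely licensed ones. A small Python sketch - the HTML fragment and the "free" list are invented for illustration, not taken from the conference mails cited above:

```python
import re

# Hypothetical fragment, of the kind an HTML-mail client produces.
html = '<p style="font-family: Verdana, Arial, sans-serif">Dear all, ...</p>'

# Illustrative allowlist of freely licensed / generic faces; anything
# else is treated here as a proprietary typeset.
FREE_FACES = {"dejavu sans", "liberation serif", "sans-serif", "serif", "monospace"}

def proprietary_fonts(markup: str) -> list[str]:
    """List the font-family names in `markup` not on the free list."""
    families = []
    for decl in re.findall(r'font-family:\s*([^";<]+)', markup):
        families += [f.strip().strip("'\"").lower() for f in decl.split(",")]
    return [f for f in families if f and f not in FREE_FACES]

# proprietary_fonts(html) reports the Microsoft-licensed faces
# (Verdana, Arial) that the "harmless" paragraph silently demands.
```

A crude scanner, certainly - but it suffices to show the point: the sender never chose those faces deliberately, the composing software inserted them, and every recipient's machine is asked to reproduce them.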
In addition, even these much cut-down quotes give some hint of the enormous waste involved in this sort of "ICT" - the colour prescription in the last of the three is visibly telling. One side effect, but one of some weight and of relevance here, is the redistribution of profit from this scandalously wasteful abuse of resources into the pockets of "Northern" cyberlords in the context of globalised North-South (tele)communication: I think it can be shown that this alone by far surpasses the most "optimistic" marks of "official development aid" set for the so-called "Millennium Development Goals". I cannot but mention my sad impression of the role of NGDOs, the "non-governmental development organisations", in this context. Despite their expertise in the fields they work in, despite many of their well-reflected and often thoroughly critical positions towards "Northern" exploitation of the "South", almost all of their daily practice in their core activity - which is "communication" - reproduces some of the worst forms of rent extraction for the cyberlords. Just one example from a Brussels-based network active in solidarity work and support for Sahel peasant farmers: their own, excellent equipment was paid for with grants from the EU and the Belgian government - and their corresponding "partner" NGOs in Timbuktu and other places of the Sahel region have been forced ever since to lay out three quarters of their budget, or even more, for investment, maintenance and upgrading of the Apple-made communication gear, just in order to share the wisdom of the Brussels experts. Or I could invite you to have a look at the website of the umbrella organisation of all the EU's development NGOs, national "platforms" and transnational "networks" combined. Just dialling in to their homepage from a hotel in Dar-es-Salaam or Abidjan could well produce a bill of 10 to 30 dollars or euros.
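The order of magnitude of such a bill is easy to reconstruct: a graphics-heavy homepage of a few megabytes, pulled over an international dial-up line billed by the minute. A back-of-envelope sketch in Python - the page size and the per-minute tariff are assumptions chosen for illustration, not measured figures:

```python
# Assumed figures, for illustration only.
page_megabytes = 4.0      # graphics-heavy homepage incl. PDFs and Flash
modem_bps = 33_600        # effective dial-up line rate in bits/second
tariff_per_minute = 1.50  # assumed hotel/international tariff, USD or EUR

transfer_seconds = page_megabytes * 8_000_000 / modem_bps
bill = transfer_seconds / 60 * tariff_per_minute
# roughly a quarter of an hour online - and a bill in the tens of
# dollars, squarely inside the 10-to-30 range claimed above
```

Vary the assumptions as you like; with any realistic hotel tariff and any Flash-laden homepage, the result stays in the same bracket.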
And you would still be nowhere without the very latest versions of proprietary software from Adobe and Microsoft on your laptop: because otherwise you would have no chance to "share" the quite excellent output presentation of Europe's development-promotion elite. Now you would go - and I mean go, walk, because there is no transport and you just spent the taxi money on the telco bill - you would go to your friends around the corner, some 8 to 10 kilometers away at the university campus. Where the (luckily donated) Pentium I with its black-and-white monitor sits dead in the corner of the institute, as the electricity has just gone again, and its robust old needle printer would have needed a new ribbon anyway. Then please explain to them the importance of a four-colour print-out, including high-resolution photo illustrations, of the latest policy statement from the EU's most prestigious lobby for their development interests. [But back to the realities here in the "North":] Thus, with the field well prepared, the critical mass has been achieved to close the trap: with "Digital Rights Management" installed on the majority of computers, each and every use of a "proprietary value" can be checked and tracked. Arguably, it will be just a matter of time _and_ of the "securisation" of NET traffic - especially for micropayments - before this check on the "legality" of copyrighted material used (for which the license rent is already paid at installation) is extended to a pay-per-use mode, and with this the rent cashed in by the cyberlords will increase enormously. (C.) The "security" issue is indeed the final argument used to justify "trusted" net-connected computing. The TCG reasons quite formally that its chip would allow the unambiguous identification of the "trusted" machine.
And M$'s "Longhorn", it has to be added, claims unambiguous identification of a "legitimate" user to that machine; besides of passwords for "secure start-up" of a Longhorn mad cow, M$ envisages number of additional identity checks, biometrical gadgets like finger prints or iris or voice recognition, as well as "smart card" readers. The weirdness of this reasoning is that a "secure" network does and did exist long before the public internet - the teller machines, or cash automats, rightly are connected through perfectly separately wired networks, and there's no need whatsoever to double this already existing infrastructure. The transposition of this "secured" concept of "trusted"[11] machines and their network into the public realm of the internet sure has other, and undeclared reasons. The implementation of DRM is is definitely one of them, and the most important. But I think there's more to it. By attacking the very "nature" of the "public" net, and by attempting its enclosure, it aims at the vulnerable heel of Free/Open Software, namely, its very dependence on this (inter)NET: while FOSS development had depended on the Net as an "alternative" means of distribution, it has neglected the development of an alternative "plan B" for the case of the Net being highjacked by its adversary.[12] So what is needed ? To my dismay, I did not see much of an approach in the first phase of the WSIS to freeward the very basics of standards for free, non-proprietary interoperability of the Net. Great declarations for "freedom of expression" and "fundamental rights" of such seem quite useless if they are subject to pre-paid licence fees for the use of the copyrighted file formats and typesets these declarations are printed with. 
Secondly, there seems to be no alternative to the ITU as the international regulatory body - and you may like or dislike its bureaucracy, and regardless of its subversion by the business lobby[13] - if you want to pass binding, actionable safeguards for at least the basic public standards. The few rules "governing" the basic technical functionality of the Net had been precisely that. Thirdly, there is definitely a point in demanding more "public" participation - i.e. beyond state bureaucrats and the business lobby - in the ITU's procedures, and more open access to information about its work.

***