Nothing happens unless first a dream. (Carl Sandburg)
In the bar of a Holiday Inn in Ipswich I find myself in conversation with the writer of a popular television drama, when the subject of Twitter comes up:
‘The thing is, Dan,’ he exclaims with a characteristic pause for emphasis or comedy (I could never tell which), ‘I don’t want everyone to know when I’m sitting on the toilet.’
He has a point. Though currently this sort of information is only made available if you choose to make it so (and I’m sure there is an app for that), it is a perfectly rational fear. When we trace the origins of the internet and observe how it, and we alongside it, have evolved, and where it all seems to be heading, it is not unreasonable to assume that details of this nature, and indeed every other facet of daily life, may eventually be shared or even required by the system. For now, however, if you wish to broadcast ‘#numbertwo’, it’s up to you.
The Internet is the latest stage of our communicative development: from telegraphy through telephony and radio to the personal computer, our connective network systems have evolved since the first remote message exchange. It began in strictly academic circles, developed primarily to assist in the sharing of research and resources. Yet the network always had a human interest; from J.C.R. Licklider’s delightful concept of a ‘Galactic Network’ at MIT in 1962 to Doug Engelbart’s ‘Augmentation of Human Intellect’ research project at the Stanford Research Institute in 1969, it was seen that by making computers talk to each other, by finding a common ‘language’ between two machines, we might mirror something of our own development, might learn from it and improve our lives.
Early advancements came about through missives known as ‘RFCs’ (or Requests for Comments), initially sent via physical snail-mail (remember that?), then quickly by primitive email. Users would keep each other informed as to how the general infrastructure of the online experience was working and how it could be improved. Just as the concept grew out of our own invention, so it developed and evolved with our own feedback. The development of the Internet was in itself a social network.
We have assimilated these advances linguistically too, so that they seem to fit naturally and grow alongside us. Words such as ‘host’, ‘domain’, even the ancient precaution of a ‘firewall’, all evoke a solid, historical idiom: a brave new world that has such familiar terms in it. ‘Forum’ is a perfect example. The ancient Roman marketplace offered so much more than market stalls: sales pitches, entertainments, side-shows, soliciting and drunken japery, as well as nodes of discussion, arguments, opining and pontificating, quite aside from everyday bitching and gossip. Where the early digital lexicon involved many hybrid terms or neologisms, a few of which survive (‘blog’, ‘download’ and the like), the mainstay of our virtual vocabulary is drawn from our own preexisting, ‘real’ world. As the ‘appliance’ was to electricity, so is the ‘application’ to the Internet. The ‘architecture’ of the online world has its topographic ‘sites’ created by ‘developers’: all of it sounds like it has been here before.
Of course, as with something like the ‘Hoover’, there are certain brands that dominate a new field and become words in their own right. One of the more significant of these is ‘Google’, itself a misspelling of the word ‘googol’, the expression of the number 10¹⁰⁰, a one followed by a hundred zeros. Google has always occupied a curious position in the online sphere and has achieved something of a Darwinian prominence in its survival and defeat of lesser search engines and operations: how many ‘Chrome’ users remember the web’s first widely used ‘browser’, ‘Mosaic’? Who now asks that tired old butler ‘Jeeves’…?
When the futurist and inventor Ray Kurzweil was signed up as a Director of Engineering at Google a few weeks ago, it seemed to be a moment of great significance. The visionary, the pioneer of technologies such as optical character recognition, speech recognition and text-to-speech synthesis, one of the great prophets of artificial intelligence and the Cyber Age, was officially absorbed into the belly of the beast. The brand, indeed the word, that has infused our daily lives had co-opted the services of this digital dreamer. But nothing happens unless first a dream. The title of Kurzweil’s best-selling book now begins to look like a statement of fact: The Singularity Is Near. The age of super-intelligence is practically upon us: the point at which the global supercomputer has received enough feedback, enough information, that it can develop and improve itself so as to be self-governing, even self-conscious; the point at which the machine has learned to mimic human behaviour and subsequently to improve upon it; the point beyond which anything might be possible.
In a recent issue of Wired magazine, ‘Senior Maverick’ Kevin Kelly writes of this imminent ‘singularity’, the eventuality that the robots will ‘take over’, in an essay entitled ‘Better Than Human’. In it he includes a visual matrix of new and existing jobs that humans and/or machines can and cannot do. He briefly touches upon the progression of humans towards roles that they, uniquely (‘for now’), can fulfil:
‘ballerinas, full-time musicians, mathematicians, athletes, fashion designers, yoga masters, fan-fiction authors, and folks with one-of-a-kind titles on their business cards.’
He qualifies his tongue-in-cheek list, projecting that ‘of course, over time, machines will do these as well’. I’m not so sure; after all, the Luddites were wrong to fear that the machines would leave nothing for people to do. Fear of unemployment caused by mechanisation, or what economists call the ‘Substitution Effect’, proved unfounded because a countervailing ‘Output Effect’ in fact created work, albeit displacing it. So today, too, there is a shift, a displacement. Kelly’s list seems to echo the less ironic words of the Dalai Lama, that the planet ‘desperately needs more peacemakers, healers, restorers, storytellers and lovers of all kinds’, roles that will remain human long after many others have gone the way of all wires. It is the quintessentially human characteristics that will prevail: curiosity, imagination, innovation, compassion, irony, humour. ‘Drop by drop’, as Chekhov wrote, we ‘squeeze the slave’ from ourselves: our roles become ever more prescribed by the technology we have created and perpetuated, and consequently our introspection increases. The network and its resulting gadgetry are now such an inextricable part of our lives, and so entwined with our evolution, that instead of distancing us from ourselves they are forcefully reminding us that we exist and, hopefully, awakening us to why we exist.
Feedback loops of all kinds are vital, even those not directly contributing to the development of the Internet itself. Our online presence stands as a Book of Reckoning, every click and comment defining us. Supposed anonymity perpetuates a lack of accountability, and so trawling (or trolling) the ‘bottom half of the internet’ reveals how far from greatness we can sink. But the anonymous, pixelated veil is an illusion; whilst the individual identity seems hidden (for now), this input shapes us all, standing as a record of our thoughts and motivations. It is our collective consciousness made visible, viral warts and all. Leaning towards digital optimism, we face a conscious choice and a responsibility to our own self-improvement: to find the good. Tweet as you would be twoten to.
In his 1950 paper ‘Computing Machinery and Intelligence’, the mathematician and computer scientist Alan Turing upgraded his initially proposed question, ‘Can machines think?’, to ‘Can a computer do what we (as thinking entities) can do?’ Now that the answer to that question is almost ubiquitously ‘yes’, the questions we face are ‘What is it we can do that the machine can’t?’ and ‘What is it the machine may never be able to do?’ Go to the toilet? Laugh? Perhaps love?