Tuesday, May 30, 2017

The Way Ahead (or Pandora 5.0)

The great Canadian Marshall McLuhan – philosopher, should one call him? – whose prophetic soul seems more and more amazing with each passing year, gave us the phrase the ‘Global Village’ to describe the post-printing age that he already saw coming back in the 1950s. Where the Printing Age had ‘fragmented the psyche’, as he put it, the Global Village – whose internal tensions exist in the paradoxical nature of the phrase itself: both Global and a village – would, he thought, tribalise us and actually regress us to a second oral age. Writing in 1962, before even ARPANET, the ancestor of the internet, existed, this is how he forecasts the electronic age which he thinks will change human cognition and behaviour:
“Instead of tending towards a vast Alexandrian library the world will become a computer, an electronic brain, exactly as in an infantile piece of science fiction. And as our senses go outside us, Big Brother goes inside. So, unless aware of this dynamic, we shall at once move into a phase of panic terrors, exactly befitting a small world of tribal drums, total interdependence, and superimposed co-existence. […] Terror is the normal state of any oral society, for in it everything affects everything all the time. […] In our long striving to recover for the Western world a unity of sensibility and of thought and feeling we have no more been prepared to accept the tribal consequences of such unity than we were ready for the fragmentation of the human psyche by print culture”.
Like much of McLuhan’s writing, densely packed with complex ideas as it is, this passage repays far more study and unpicking than would be appropriate here, but I think we might all agree that we have arrived at that “phase of panic terrors” he foresaw. Let me suggest a few of the anxieties we feel about the digital world today:
  • Aside from the ugliness and ferocity of trolling on social media, we fret over the so-called post-truth age, with its arguments over what is ‘fake news’ and what are ‘alternative facts’ and the concomitant diminution of trust in any authoritative source of information, or consensus as to the validity and credibility of news and current events at all.
  • The refusal of social media platforms to take responsibility for those dangerous, defamatory, inflammatory and fake items whose effects would have legal consequences for traditional printed or broadcast media, but from which they escape.
  • The rise of big data and one’s personal footprint of analytics, spending and preferences becoming, willy-nilly, corporate property. The ever-present threat to our privacy that this involves concerns us. Everyone we know, everything we read, watch, listen to, eat, everything we desire and perhaps every electronic message we send – all readable by a corporation or a government.
  • The unfair non-contractual working practices afoot in the so-called Gig Economy – Uber Drivers, delivery couriers and so on. Not to mention the effect of those services on the pre-existing workforce of cab-drivers and others whose hard won qualifications might be set at nought.
  • The ghettoisation of opinion and identity, known as the filter bubble, apportioning us narrow sources of information that accord with our pre-existing views, giving a whole new power to cognitive bias, entrenching us in our political and social beliefs, ever widening the canyon between us and those who disagree with us. And the more the canyon widens, the farther away the other side and the less likely we are to hear or see what goes on there, intensifying the problem and always at the super new speeds this digital new world confers.
  • The threats to personal, national and transnational security – threats emanating from ‘bad actors’ that might be cyber-extortionists, unscrupulous corporations, unfriendly foreign powers, intrusive domestic governments and their agencies.
  • The threat to the young of grooming that leads to abuse, or of recruitment that leads to extremist and violent ideologies and actions.
  • Algorithms continue every microsecond to harvest data on my movements, by GPS for example, analyse my actions, read my Gmail, build up information on my mood, sexuality, political, religious and cultural affiliations, habits and propensities – and such data is for sale …
  • Bullying – especially of the young. Body shaming. Blackmail. Extortion. Revenge porn. On-air suicides, encouragements to self-harm and live-streamed violence.
  • The corporate assault on net neutrality.
  • The fragile security of our entire digital world and the ever-present looming possibility of a Big One, that cataclysm brought about either by malice, act of war, systemic technical failure, or some other unforeseen cause, an extinction level event which will obliterate our title deeds, eliminate our personal records, annul our bank accounts and life savings, delete all the archives and accumulated data of our existences and create a kind of digital winter for humankind.
These are some of the things that rightly worry us. An example of every one of them can be found almost daily in a story online or in the mainstream so-called dead-tree media. All of them, individually or in a potential catastrophic avalanche, threaten to engulf us. One thesis I could immediately nail up to the tent flap is to join in the call for aggregating news entities like Facebook to be legally classified as publishers. At the moment they are evading responsibility for their content because they can claim to be ‘platforms’ rather than publishers. Given that they are the main source of news for over 80% of the population, that is clearly an absurd anomaly. If they and Twitter and like platforms recognise their responsibility as publishers, it will certainly help them better police their content for unacceptable libels, defamations, threats and other horrors that a free but legally bound press would as a matter of course be expected to control. But that correction of the legal standing and responsibility of social media platforms is almost certainly going to happen, and soon, and is, frankly, small potatoes – as, to some extent, are the other anxieties I’ve outlined. For there is so much more coming – as they say in America – down the pike. Some huge potatoes are looming on the horizon. (...)

If intelligent systems can design systems more intelligent than themselves, the exponentially steep rate of improvement will dizzy our minds. It’s very important to keep this perhaps obvious point in mind: in the field of technology we never arrive at a state of finished satisfaction. The way things are now is not how they will be in two years’ time. Heraclitus said you cannot step into the same river twice, for fresh water is always flowing past you. The technological stream similarly allows for no sense of stasis. Technology is not a noun, it is a verb – a process. We know that our economics is in flux too, predicated and dependent on growth, growth, growth. What we have to accept is that there has been a confluence of that economic imperative for growth, Moore’s Law of ever-increasing computational power, human curiosity and ambition, and our very particular kind of consumer addiction and need for the new – all of which have swollen the river of technological progress into flood.

Great gifts will come from this new phase, from Pandora 5.0, of course they will. Let me sketch a few, more or less at random and far from complete. History teaches that everything I say will be an underestimation. So: AI, robotics and smart devices in the biotech and medical sphere are already coming online; the NHS has a deal with Google’s DeepMind machine-learning AI (originally a British company, DeepMind is now the world champion at the game Go, which it taught itself). This kind of AI in the clinical realm will offer earlier diagnosis and the ability to read medical imaging data with much more accuracy and spot incipient signs of disease, making radiologists, for example, redundant. In the area of virology and related sciences it can assist with the analysis of amino acids and protein structures and the creation of serums and treatments, hugely accelerating drug development. We will see the manufacture of greater and better cybernetic prostheses – bionic eyes, ears and limbs – more robotic surgery, and faster and more accurate genetic analysis, genotyping and biometric data. Brain-computer interfaces will allow thought and dream reading, and the operation by thought alone of machinery, devices, musical instruments, paintbrushes and tools. Brain-machine data input and output will transform a huge number of activities and operations, allowing the happy combination (harnessing Moravec’s paradox) of the best human abilities of motor skills and perception with the best machine abilities of calculation and precision. We will see care robots for the elderly, and cyber Mary Poppins guardians and babysitters for children and the vulnerable. The fight for greater longevity will unquestionably rely on AI techniques and usher in the possibility of the conquest of death itself. We are doubtless used to hearing that the first human to live to 200 years old is already alive; the younger people in this room can certainly expect to break the 120 barrier.
I have been told by more than one solemn-faced scientist that the first person to live to 1,000 is probably alive, and that immortality is technically feasible and within reach. In other arenas, not counting the world of work, we will see better weather forecasting, an amelioration of traffic flow, and automated shopping and delivery. A diminution of human error in multiple areas of exchange and interaction will lead to all kinds of undreamed-of benefits.

The next big step for AI is the inevitable achievement of Artificial General Intelligence, or AGI, sometimes called ‘full artificial intelligence’ – the point at which machines really do think like humans. In 2013, hundreds of experts were asked when they thought AGI might arise, and the median prediction was the year 2040. After that, the probable – most would say certain – next step is artificial super-intelligence, and the possibility of reaching what is called the Technological Singularity: what computer pioneer John von Neumann described as the point “…beyond which human affairs, as we know them, could not continue.” I don’t think I have to worry about that. Plenty of you in this tent have cause to, and your children beyond question will certainly know all about it. Unless of course the climate causes such havoc that we reach a Meteorological Singularity. Or the nuclear codes are penetrated by a self-teaching algorithm whose only purpose is to find a way to launch…

by Stephen Fry