Sunday 11 January 2015

THE NUMBER OF EVERYTHING

Dick Pountain/ Idealog 240/ 09 July 2014 11:45

My interest in computing has always been closely connected to my love of maths. I excelled in maths at school and could have studied it instead of chemistry (where would I be now?). My first experience of computing was in a 1960 school project to build an analog machine that could solve sixth-order differential equations. I used to look for patterns in the distribution of primes rather than collect football cards - you're probably getting the picture. I still occasionally get the urge to mess with maths, as for example when I recently discovered Mathlab's marvellous Graphing Calculator for Android, and I'll sometimes scribble some Ruby code to solve a problem that's popped into my head.

Of course I've been enormously pleased recently to witness the British establishment finally recognising the genius of Alan Turing, after a disgracefully long delay. It was Turing, in his 1936 paper on computable numbers, who more than anyone forged the link between mathematics and computing, though it's for his crucial wartime cryptography that he's remembered by a wider public. While Turing was working on computable numbers at King's College Cambridge, a college friend of his, David Champernowne, another mathematical prodigy, was working on something rather different that's recently come to fascinate me. Champernowne soon quit maths for economics; studied under John Maynard Keynes; helped organise aircraft production during WWII; in 1948 helped Turing write one of the first chess-playing programs; and then wrote the definitive book on income distribution and inequality (which happens to be another interest of mine and is how I found him). But what Champernowne did back in 1933 at college was to build a new number.

That number, called the Champernowne Constant, has some pretty remarkable properties, which I'll try to explain here fairly gently. The number is very easy to construct: you could write a few million decimal places of it this weekend if you're at a loose end. In base 10 it's just zero, a decimal point, followed by the decimal representations of each successive integer concatenated, hence:

0.12345678910111213141516171819202122232425262728293031....
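Building the expansion really is that easy in Ruby, my scribbling language of choice; a quick sketch (the method name is my own):

```ruby
# Concatenate the decimal representations of successive integers
# to produce the first digits of Champernowne's constant.
def champernowne_digits(n)
  (1..n).map(&:to_s).join
end

puts "0." + champernowne_digits(31)
# => 0.12345678910111213141516171819202122232425262728293031
```

Feed it a large enough upper bound and you'll have those few million decimal places well before the weekend is out, memory permitting.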

It's an irrational real number whose representation goes on for ever, and it's also transcendental (like pi), which means it's not the root of any polynomial equation with integer coefficients. What most interested Champernowne is that it's "normal" in base 10, which means that each digit 0-9, and each pair, triple and so on of such digits, appears in it equally often in the limit. That ensures that any number you can think of, of whatever length, will appear somewhere in its expansion (an infinite number of times, actually). It's the number of everything, and it turns out to be far smaller (if somewhat longer) than Douglas Adams' famous 42.

Your phone number and bankcard PIN, and mine, are in there somewhere, so it's sort of like the NSA's database in that respect. Fortunately though, unlike in the NSA's database, they're very, very hard to locate. The Unicode-encoded text of every book, play and poem ever written, in every language (plus an infinite number of versions with an infinite number of spelling mistakes) is in there somewhere too, as are the MPEG4 encodings of every film and TV programme ever made (don't bother looking). The names and addresses of everyone on earth, again in Unicode, are in there, along with those same names with the wrong addresses. Perhaps most disturbingly of all, every possible truncated approximation to Champernowne's constant itself should be in there, an infinite number of times, though I'll confess I haven't checked.
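Hard to locate, but fun to try: here's a throwaway Ruby sketch (helper name and search targets are mine, and of course no real PIN) that builds an initial stretch of the expansion and hunts for a digit string in it:

```ruby
# Search for a digit string within an initial stretch of
# Champernowne's expansion (the concatenation of 1..upto).
def find_in_champernowne(target, upto = 200_000)
  (1..upto).map(&:to_s).join.index(target)  # nil if not in this stretch
end

puts find_in_champernowne("1234")   # => 0, right at the start
puts find_in_champernowne("2014")   # found a little deeper in
```

Short strings turn up almost immediately; each extra digit pushes the expected position roughly ten times deeper, which is why your PIN is safe enough in practice.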

Aficionados of Latin-American fiction will immediately see that Champernowne's constant is the numeric equivalent of Jorge Luis Borges' famous short story "The Library of Babel", in which an infinite number of librarians traipse up and down an infinite spiral staircase connecting shelves of random texts, searching for a single sentence that makes sense. However Champernowne's is a rather more humane construct, since not only does it consume far less energy and shoe-leather, but it also avoids the frequent suicides -- by leaping down the stairwell -- that Borges imagined.

A quite different legend concerns an Indian temple at Kashi Vishwanath, where Brahmin priests were supposed to continually swap 64 golden disks of graded sizes between three pillars (following the rules of that puzzle better known to computer scientists as the "Tower of Hanoi"). When they complete the last move of this puzzle, it's said the world will end. It can be shown that for priests of average agility this will take around 585 billion years, but we could remove even that small risk by persuading them to substitute a short Ruby program that builds Champernowne's constant (we'll need the BigDecimal module!), to be left running on a succession of PCs. Then we could be absolutely certain that while nothing gets missed out, the end will never arrive...
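That 585-billion-year figure is easily checked on the back of an envelope; a Ruby one, assuming the priests manage one move per second:

```ruby
# Tower of Hanoi with 64 disks needs 2**64 - 1 moves in total.
moves = 2**64 - 1                       # 18,446,744,073,709,551,615
seconds_per_year = 365.25 * 24 * 3600   # ~31.6 million
years = moves / seconds_per_year
puts "about %.0f billion years" % (years / 1e9)   # about 585 billion years
```

At one move per second the priests are comfortably outlasted by the Sun, which is scheduled to swallow the Earth rather sooner.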

I, ROBOT?

Dick Pountain/ Idealog 239/ 06 June 2014 09:56

Like many males of my generation I grew up fairly well-disposed toward the robot. Robby the Robot, film star, was all the rage when I was 11, and Asimov's Laws of Robotics engaged my attention as a teenaged sci-fi reader. By the time I became involved in publishing underground comics in the early 1970s the cuteness was wearing off robots, but even so the threat was moderated by humour. The late Vaughn Bodé - nowadays beloved by all the world's graffiti artists - drew a strip called "Junkwaffel" that depicted a world cleansed of humans but gripped in permanent war between foul-mouthed, wise-cracking robot soldiers. In some ways these were the (far rougher) prototypes of R2-D2 and C-3PO.

Out in the real world robots started to appear on factory production lines, but they were doing those horrible jobs that humans shouldn't do, like spraying cellulose paint, and humans were still being employed to do the other stuff. When I got involved in computer programming myself I was drawn toward robotics thanks to an interest in Forth, a language originally invented to control observatory telescopes and ideally suited to robot programming. The problems of robots back then were all about *training* them to perform desired motions (as opposed to spelling those motions out in X,Y,Z coordinates) and building in enough intelligence to give them more and more autonomy. I still vividly remember my delight when a roboticist friend at Bristol Uni showed me robot ducklings they'd built that followed each other just like the real thing, using vision alone.

Given this background, it will come rather hard to have to change my disposition toward the robot, but events in today's world are conspiring to force me to do just that. While reading a recent issue of New Scientist (26 April 2014), I was struck by two wholly unrelated articles that provide a powerful incentive for such a change of attitude. The first of these involved the Russian Strategic Missile Force, which has for the first time deliberately violated Asimov's main law by building a fully-autonomous lethal robot that requires no permission from a human to kill.

The robot in question is a bit aesthetically disappointing in that it's not even vaguely humanoid-looking: it looks like, indeed *is*, a small armoured car on caterpillar tracks that wields a 12.7mm heavy machine gun under radar, camera and laser control. It's being deployed to guard missile sites, and will open fire if it sees someone it doesn't like the look of. I do hope it isn't using a Windows 8 app for a brain. Whatever your views on the morality of the US drone fleet, it's important to realise that this is something quite different. Drones are remotely controlled by humans, and can only fire their weapons on command from a human, who must make all the necessary tactical and moral decisions. The Russian robot employs an algorithm to make those decisions. Imagine being held up at gunpoint by Siri and you'll get the difference.

However it was the other article that profoundly upset my digestive system, an interview with Andrew McAfee, research scientist at MIT's Center for Digital Business. Asked by interviewer Niall Firth "Are robots really taking our jobs?", McAfee replied with three possible scenarios: first, that robots will take our jobs in the short term, but a new equilibrium will be reached, as it was after the first Industrial Revolution; second, that they'll replace more and more professions and massive retraining will be essential to keep up; third, the sci-fi-horror scenario where robots can perform almost all jobs and "you just don't need a lot of labour". He thinks we'll see scenario three in his lifetime (which I hope and trust will be longer than mine).

It was when he was then asked about any possible upside that my mind boggled and my gorge rose: the "bounty" he saw arising was a greater variety of stuff of higher quality at lower prices, and most importantly "you don't need money to buy access to Instagram, Facebook or Wikipedia". That's just as well really, since no-one except the 0.1% who own the robots will have any money. On that far-off day I foresee, when a guillotine (of 3D-printed stainless steel) has been erected outside Camden Town tube station, McAfee may still be remembered as a 21st-century Marie Antoinette for that line.

The bottom line is that robots are still really those engaging toys-for-boys that I fell for back in the 1950s, but economics and politics require the presence of grown-ups. Regrettably the supply of grown-ups has been dwindling alarmingly since John Maynard Keynes saved us from such imbecilities the last time around. If you're going to make stuff, you have to pay people enough to buy that stuff, simples.



THE JOY OF CODING?

Dick Pountain/ Idealog 238/ 08 May 2014 19:30

I've admitted many times in this column that I actually enjoy programming, and mostly do it for fun. In fact I far prefer programming to playing games. Given my other interests, people are often surprised that I don't enjoy chess, but the truth is that the sort of problems it creates don't interest me: I simply can't be bothered to hurt my brain thinking seven moves ahead when all that's at stake is beating the other guy. I did enjoy playing with Meccano as a kid, and did make that travelling gantry crane. I can even imagine the sort of satisfaction that might arise from building Chartres Cathedral out of Lego, though having children myself rendered me phobic about the sound of spilling Lego bricks (and the pain of stepping on one in bare feet). But programming is the ultimate construction game, where your opponent is neither person nor computer but the complexity of reality itself.

Object-oriented programming is especially rewarding that way. You can simulate anything you can imagine, describe its properties and its behaviours, then - by typing a single line of code - create a thousand (or a million) copies of it and set them all working. Then call that whole system an object and create a hundred copies of that. It's all stuff you can't do in the heavy, inertial, expensive world of matter: making plastic bits and pieces by 3D printing may be practical, even useful, but it lacks this Creator of Worlds buzz.

Since I'm so besotted by programming as recreation, I must surely be very excited by our government's "Year of Code" initiative, which aims to teach all our children how to write programs - or about "coding" as the current irritating locution would have it? Actually, no I'm not. I'm perfectly well aware that my taste for programming as recreation is pretty unusual, very far from universal, perhaps even eccentric, a bit like Base Jumping or worm farming. The idea that every child in the country is going to develop such a taste is ludicrous, and that rules out coding for pleasure as a rationale. It will most likely prove as unpleasant as maths to a lot of kids, and put them off for life.

But what about "coding" as job skill, as vital life equipment for gaining employment in our new digital era? Well there's certainly a need for a lot of programmers, and the job does pay well above average. However you can say much the same about plumbers, electricians and motor mechanics, and no-one is suggesting that all children should be taught those skills. The aim is to train school teachers to teach coding, but it makes no more sense for every child to learn programming than it does to wire up a ring-main or install a cistern. Someone who decides to pursue programming as a profession needs solid tuition in maths and perhaps physics, plus the most basic principles of programming like iteration and conditionality, which ought to be part of the maths curriculum anyway. Actual programming in real languages is for tertiary education, not for five-year-olds as the Year of Code intends.

The whole affair reeks of the kind of gimmicky policy a bunch of arts and humanities graduates, clueless about technology, might think up after getting an iPad for Christmas and being bowled over by the wonderful new world of digital communications. Their kids probably already know more about "coding" than they do via self-tuition. However there are those who detect a more sinister odour in the air. For example Andrew Orlowski, curmudgeon-in-chief at The Register, has pointed out a network of training companies and consultants who stand to make big bucks out of the Year of Code, in much the same way firms did during the Y2K panic: they include venture capital company Index Ventures, which has Year of Code's chairman Rohan Silva as its "Entrepreneur in Residence", and US training company Codecademy. Organisations that are already reaching children who are actually interested in programming, like the Raspberry Pi foundation, appear to be sidelined and cold-shouldered by the hype merchants: the foundation's development director Clive Beale has claimed that "The word 'coding' has been hijacked and abused by politicians and media who don't understand stuff".

Personally I'd love to believe all children could be taught to gain as much pleasure from programming as I do, but it's unlikely. Like singing, dancing, drawing or playing football, some can do it and like it, others can't and won't, and the notion that everyone has to be able to do it for the sake of the national economy has unpleasantly Maoist undertones, with backyard code foundries instead of steelworks.

 

SOCIAL UNEASE

Dick Pountain/ Idealog 350/ 07 Sep 2023 10:58

Ten years ago this column might have listed a handful of online apps that assist my everyday...