Monday 16 November 2015

BITS STILL AIN'T ATOMS

Dick Pountain/Idealog 251/07 June 2015 13:48

I'd started to write that I'm as fond of gadgets as the next man, but in truth I'm only as fond as the one after the one after him (which is still fairly fond). For example I get enormous pleasure from my recently-acquired Zoom G1on guitar effects pedal, frightening the neighbours with my PhaseFunk twangs. However I've resisted the hottest of today's gadgets, the 3D printer, with relative ease. Partly it's because I have no pressing need for one: being neither a vendor of cornflakes nor a devotee of fantasy games or toy soldiers, I just don't need that many small plastic objects. I can see their utility for making spare parts for veteran mechanical devices, but I don't do that either. What deters me more though is the quasi-religious atmosphere that has enveloped 3D printing, as typified by those reverential terms "making" and "maker". People desperately want to bridge the gap between digital representation and the real world, between CGI fantasy and life, and they've decided 3D printing is a step on the way, but if so it's a tiny step toward a very short bridge that ends in mid-air.

One problem is precisely that 3D printing tries to turn bits into atoms, but pictures don't contain the internal complexity of reality. There are serious applications of 3D printing, for example in the aerospace industry, where components can be printed in sintered metal more quickly, more cheaply and with greater geometric complexity than by traditional forging or casting techniques. Even so two things remain true: such parts are typically homogeneous (all the same metal) and made in relatively small quantities, since 3D printing is slow - if you need 100,000 of something then 3D print one and make a mould from it for conventional casting. Printing things with internal structure of different materials is becoming possible, but remains topologically constrained to monolithic structures.

That's the second problem, that 3D printing encourages thinking about objects as monolithic rather than modular. Modularity is a profound property of the world, in which almost every real object is composed from smaller independent units. In my Penguin Dictionary of Computing I said: "modules must be independent so that they can be constructed separately, and more simply than the whole. For instance it is much easier to make a brick than a house, and many different kinds of house can be made from standard bricks, but this would cease to be true if the bricks depended upon one another like the pieces of a jigsaw puzzle." The basic module in 3D printing is a one-bit blob firmly attached to the growing object.

I recently watched a YouTube video about a project to 3D print mud houses for developing countries, and it was undeniably fascinating to watch the print head deposit mud (slowly) in complex curves like a wasp building its nest. But it struck me that, given the computing power attached to that printer, it would be faster to design a complex-curved brick mould, print some and then fill them with mud and assemble the houses manually.

The ultimate example of modularity, as I never tire of saying, is the living cell, which has a property that's completely missing from all man-made systems: every single cell contains not only blueprints and stored procedures for building the whole organism, but also the complete mechanism for reproducing itself. This mind-boggling degree of modularity is what permitted evolution to operate, by accidentally modifying the blueprints, and which has led to the enormous diversity of living beings. No artificial "maker" system can possibly approach this status so long as fabrication remains homogeneous and monolithic, and once you do introduce heterogeneous materials and internal structure you'll start to confront insuperable bandwidth barriers as an exponentially-exploding amount of information must be introduced from outside the system rather than being stored locally. A machine that can make a copy of itself seems to really impress the maker community, but you just end up with a copy of that machine. A machine that copies itself, then makes either an aeroplane, or a bulldozer, or a coffee machine out of those copies is some way further down the road.

I was led to these thoughts recently while watching Alex Garland's excellent movie Ex Machina. In its marvellous denouement the beautiful robot girl Ava kills her deeply unpleasant maker and escapes into the outside world to start a new, independent life, but first she has to replace her arm, damaged in the final struggle, with a spare one. Being self-repairing at that level of granularity is feeble by biological standards, and as she stood beaming at a busy city intersection it struck me that such spare parts would be in short supply at the local hospital...

STRICT DISCIPLINARIAN

Dick Pountain/Idealog 250/05 May 2015 11:23

After photography my main antidote to computer-trauma is playing the guitar. Recently I saw Stefan Grossman play live for the first time at London's King's Place, though I've been learning ragtime picking from his books for the last 30 years. He played his acoustic Martin HJ-38 through a simple PA mike, and played it beautifully. Another idol of mine is Bill Frisell, who could hardly be more different in that he employs the whole gamut of electronic effects, on material from free jazz, through bluegrass to surf-rock. Dazzled by his sound I just purchased a Zoom G1on effects pedal from Amazon, and am currently immersed in learning how to deploy its 100 amazing effects.

The theme I'm driving at here is the relationship between skill, discipline and computer-assistance. There will always of course be neo-Luddites who see the computer as the devil's work that destroys all skills, up against pseudo-modernists who believe that applying a computer to any banal material will make it into art. Computers are labour-savers: they can be programmed to relieve humans of certain repetitive tasks and thereby reduce their workload. But what happens when that repetitive task is practising to acquire a skill like painting or playing a musical instrument?

The synth is a good example. When I was a kid learning to play the piano took years, via a sequence of staged certificates, but now you can buy a keyboard that lets you play complex chords and sequences after merely perusing the manual. Similarly if you can't sing in tune a not-that-inexpensive Auto-Tune box will fudge that for you. Such innovations have transformed popular music, and broadened access to performing it, over recent decades. Does that make it all rubbish? Not really, it's only around 80% rubbish, like every other artform. The 20% that isn't rubbish is made by people who still insist on discovering all the possibilities and extending their depth, whether that's in jazz, hiphop, r&b, dance or whatever.

Similar conflicts are visible with regard to computer programming itself. I've always maintained that truly *great* programming is an art, structurally not that unlike musical composition, but the vast majority of the world's software can't be produced by great programmers. One of my programming heroes, Prof Tony Hoare, has spent much of his career advocating that programming should become a chartered profession, like accountancy, in the interests of public safety since so much software is now mission-critical. What we got instead is the "coding" movement which encourages absolutely everybody to start writing apps using web-based frameworks: my favourite Guardian headline last month was "Supermodels join drive for women to embrace coding". Of course it's a fine idea to improve everyone's understanding of computers and help them make their own software, but such a populist approach doesn't teach the really difficult disciplines involved in creating safe software: it's more like assembling Ikea furniture, and if that table-leg has an internal flaw your table's going to fall over.

Most important of all though, there's a political-economic aspect to all this. Throughout most of history, up until the last century, spending years acquiring a skill like blacksmithing, barbering, medicine, singing or portrait painting might lead to some sort of a living income, since people without that skill would pay you to perform it for them. Computerised deskilling now threatens that income stream in many different fields. Just to judge from my own friends, the remuneration of graphic designers, illustrators, photographers and animators has taken a terrible battering in recent years, due to digital devices that opened up their field and flooded it with mostly mediocre free content. The argument between some musicians and Spotify revolves around a related issue, not of free content but of the way massively simplified distribution reduces the rates paid.

We end up crashing into a profound contradiction in the utilitarian philosophy that underlies all our rich Western consumer societies, which profess to seek the greatest good for the greatest number: does giving more and more people ever cheaper, even free, artefacts trump the requirement to pay those who produce such artefacts a decent living? I think any sensible solution probably revolves around that word "decent": what exactly constitutes a decent living, and who or what decides it? Those rock stars who rail against Spotify aren't sore because their children are starving, but because of some diminution in what most would regard as plutocratic mega-incomes. Some people will suggest that it's market forces that sort out such problems (and of course that's exactly what Spotify is doing). I've no idea what Stefan Grossman or Bill Frisell earn per annum, but I don't begrudge them a single dollar of it, and I doubt that I'm posing much of a threat to either of them (yet).

SOCIAL UNEASE

Dick Pountain /Idealog 350/ 07 Sep 2023 10:58 Ten years ago this column might have listed a handful of online apps that assist my everyday...