The End of the World is IT

John Martin, The Great Day of his Wrath (1851-3), oil on canvas, 196.5 x 303.2 cm, The Tate Gallery, London. Wikimedia Commons.

As visual artists started to record their travels through scenery made dramatic by nature or man, so they rendered some into visions of the Apocalypse. Gone were the disturbingly grisly scenes of torment detailed by Hieronymus Bosch, replaced by tectonic waves and clouds of colour in John Martin’s The Great Day of his Wrath (1851-3) above, and of course JMW Turner’s awe-inspiring depictions of the St Gotthard pass (The Passage of Mount St Gothard. Taken from the Centre of the Teufels Broch (Devil’s Bridge), Switzerland, 1804) and trans-Alpine routes.

Joseph Mallord William Turner, The Passage of Mount St Gothard. Taken from the Centre of the Teufels Broch (Devil’s Bridge), Switzerland (1804), watercolour and scraping out on paper, 98.9 x 68.6 cm, Abbot Hall Art Gallery, Kendal, Cumbria, England. WikiArt.

Perhaps climate change needs a visionary artist to bring it home to us all that – regardless of the political intrigues and controversies – we do face serious, maybe truly apocalyptic problems. Yet the parochial behaviour of the world’s political leaders at annual UN Climate Change Conferences shows that they still do not accept the imperatives. It will be fascinating to see whether the next, scheduled to be held in Paris during the run-up to Christmas, will bring about any real change.

For the overwhelming majority of businesses, climate change has to be someone else’s problem, and the computing industry is no exception. All of the Next Big Things on the computing horizon seem set to increase our generation of greenhouse gases, because if they do not, someone loses out on new product sales. When personal computing started to take over from mainframes in the 1980s, millions of offices drew more power to run their new computers, and (where available) more power to remove the heat that they produced.

In theory, Cloud computing could produce great economies. Most firms could strip out their server suites, replacing them with capacious pipes connecting to remote data centres. Desktop systems with multiple hard disks and a handful of processor cores could go, in favour of ‘thin’ terminals. That should have happened when network operating systems became widespread in the 1990s, but hardware manufacturers fought back to avoid the fall in revenue. The last decade saw the rise of mobile devices, but how few of us have replaced our desktop systems with a laptop, let alone replaced that laptop with an iPhone? Even in recession we clung on to all the hardware that we could squeeze onto our desk and into our pockets.

No matter where you look in the computing industry, the drive is to deliver something new and compelling that will charm or cajole more money out of the customer. Apple has excelled at doing that, from 68K Macs to laser printers, Power Macs, Intel Macs, iPods, iPhones, and iPads. Like every other successful Silicon Valley corporation, it has kept convincing us that we need to keep buying new and better products. In contrast, the car industry has fallen into a slump of its own making, having failed to innovate early enough to inspire consumption.

But the name given to the area around Apple’s headquarters, Silicon Valley, tells of the precious natural resources that innovation is consuming, of the high cost of the energy needed to transform silica and rare metals into superfast processors, and so into tomorrow’s hot products. Although we are always told that it is the software that makes the machine, each new major product release demands a wasteful hardware upgrade or replacement instead of making better use of existing resources.

As motor manufacturers belatedly grapple with the need for energy efficiency and greener power systems, the time has come for the computer industry to curb its reckless waste. This is not just a matter of slapping Energy Star stickers on monitors and cutting back on packaging, but demands a fundamental change in the industry’s development model.

While it may not be current generations that witness the consequences, I hope that our grandchildren will not curse the legacy of our technological excesses.

Updated from the original, which was first published in MacUser volume 26 issue 5, 2009. I cannot think of anything that has changed significantly over those six years to address these issues. Can you?