Despite their relatively brief lives, the linguist Edward Sapir and his protégé Benjamin Lee Whorf left ideas that still influence thinking about language.
The most controversial hypothesis attributed to them is that language influences thought. Thus if your native tongue lacks any concept of a future tense, your society should be incapable of planning; if it lacks a word for pain, then your society should be unable to feel pain. This strongest form of their hypothesis is generally rejected today, although it remains pervasive in ‘folk’ linguistics, and is the basis of Samuel R. Delany’s science-fiction novel Babel-17.
Curiously, it was that novel which partly inspired Yukihiro Matsumoto to develop Ruby, in the hope that a better programming language would engender better programs.
Kenneth Iverson, a founding father of mathematical programming, also subscribed openly to the strong form of the Sapir-Whorf Hypothesis applied to computing. My gut feeling is that, however dubious its applicability to natural languages, the hypothesis is valid for programming.
Iverson designed an extraordinary language, APL, in which complex mathematical operations are encrypted in strings of Greek and alien characters. Hugely powerful and concise to the point of opacity, APL makes coding one of the most intellectually demanding pursuits, and decoding well-nigh impossible.
For decades, C and its derivatives have been the most popular languages for application and system development. Intended originally for low-level hacking of Unix systems, C is terse, gives ready access to memory and other dangerous things, and lures even the expert to err or to write unsafe code.
A vogue for incorporating object-orientation led to C++, whose complexity makes it consummately easy to bury your flaws. C and C++ lack safety mechanisms that might protect applications from committing serious crimes, then crashing and burning.
Java was supposed to change all that by preventing direct access to memory, eliminating the resulting propensity for bugs, and doing safe things like managing memory robustly. However, it has proved cumbersome for many tasks, and its runtime security flaws have become popular targets.
Apple has ploughed a lonely furrow. It first locked OS X and iOS developers into Objective-C, which rights many of the wrongs of C++ but still offers ample opportunity to screw up. Now its loyal coders should switch to Swift, which is far more promising, so long as you only wish to code for OS X and iOS.
Given the already vast choice of languages, such as the 1,500 eloquently exemplified at 99-bottles-of-beer.net, you might be surprised to hear that there are two more jostling in the crowd, trying to fight their way to the front: Go from Google, and Rust from Mozilla Labs.
Rust, billed as ‘a safe, concurrent, practical language’, forces the programmer to isolate dangerous code into blocks that are declared to be ‘unsafe’, but does nothing to restrict the use of such blocks. Safe practice to prevent memory leaks is also on offer, but optional. Its syntax is derived from C, but introduces additional obscure notations.
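A minimal sketch of my own (not from the original article) shows what that isolation looks like in practice: Rust refuses to compile a raw-pointer dereference unless the programmer wraps it in a block explicitly marked `unsafe`, yet nothing stops such blocks from being scattered wherever one pleases.

```rust
fn main() {
    let x: i32 = 42;
    // Taking a raw pointer is allowed in safe code...
    let p = &x as *const i32;

    // ...but dereferencing it is not: the compiler insists that
    // the programmer own up to the risk in an `unsafe` block.
    let y = unsafe { *p };

    println!("y = {}", y); // prints "y = 42"
}
```

The design choice is to make danger visible and searchable, rather than forbidden: auditing a Rust codebase for memory hazards starts with grepping for `unsafe`.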
Go, aimed at making programmers more productive, claims to be ‘expressive, concise, clean, and efficient.’ It cuts out some of the more controversial features of C++, a pruning that could help prevent bugs and security holes, and tries to relegate potentially dangerous actions to a special built-in unsafe package. However, its syntax is resolutely based on C, and it adopts a weak model for concurrent processes that could generate obscure bugs.
Staying true to the C family has not helped Go or Rust. According to the TIOBE language popularity list, which places Swift in 24th place just below PostScript, Go languishes around 42nd place, and Rust is nowhere, not even in the top 100. Both are available for OS X, but I’d be surprised to see either doing anything for your Mac, iPad, iPhone, or Watch.
Perhaps if their designers had read the work of Sapir and Whorf, they would have realised that accomplishing real change in programming requires a much greater change in the language used.
Updated from the original, which was first published in MacUser volume 28 issue 07, 2012.