Why have languages not (yet) evolved to be more robust?
Small errors with large consequences have long exercised the minds of computer and information scientists. Binary encoding in 1s and 0s is a good illustration: in ordinary binary numbers a single bit error can change a value by billions. You can instead opt for ‘Gray’ code, in which successive values differ by only one bit, so a misread at a boundary lands on an adjacent value rather than one far away; or you can use checksums to detect errors.
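Both safeguards can be sketched in a few lines of Python (the function name is my own; the standard-library `zlib.crc32` stands in for a checksum):

```python
import zlib

def binary_to_gray(n: int) -> int:
    """Reflected binary (Gray) code: consecutive integers differ in one bit."""
    return n ^ (n >> 1)

# In plain binary, neighbouring values can differ in many bits
# (7 = 0b0111 and 8 = 0b1000 differ in four), so a misread at the
# boundary can be wildly wrong. In Gray code they differ in exactly one.
for i in range(15):
    a, b = binary_to_gray(i), binary_to_gray(i + 1)
    assert bin(a ^ b).count("1") == 1

# A checksum does not prevent a single-letter error, but it detects it.
assert zlib.crc32(b"he is now fit") != zlib.crc32(b"he is not fit")
print("gray code and checksum checks passed")
```

Natural language offers nothing so systematic, as the examples below show.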
There is nothing like this in most natural languages. Change a single letter and you can negate meaning:
he is now fit
he is not fit
a contrast that often catches me out when writing medical reports. Or you can get your plosive voicing wrong and transform a delectable ‘crab sandwich’ into the less appealing ‘crap sandwich’.
Homophones can be a real nightmare in spoken language too. Now that many computer-like devices incorporate ‘smart’ spell-checking and auto-completion of words, even the meticulously accurate writer can come a cropper.
Today I saw a yacht whose name advertised the website of the sailing school that operated it: girlsforsail.com – a head-turning piece of text, but easy prey for an over-eager auto-correct.
I think that Hermes (the Greek gods’ messenger and patron of eloquence), Mnemosyne (goddess of memory), and Thoth (the Egyptian god of writing) are just too tickled pink by our errors to allow such evolution. Or do you know of a language which proves me wrong?