I expected my first internship to be baptism by fire. As the most junior doctor to two teams of neurosurgeons, I knew I’d learn plenty of new skills, among them performing lumbar punctures. Within the first few days I had been guided through that, and for much of the next six months I averaged one every couple of days. In the 45 years since I completed that job, I haven’t performed another lumbar puncture, ventricular tap, tracheostomy, or any of the other techniques I had learned. I suppose if it were a matter of life or death, I could just about remember how to drill a burr hole in an emergency, but all those other skills have now faded, some in a matter of months.
Skill fade is a distinctively animal trait, and a function of our brain. It comes in degrees: the slight fade from a good vacation is quickly overcome once you’ve got your feet back under the desk; the more noticeable fade from a longer maternity or sickness absence might merit a couple of weeks of ‘return to work’; and after a year or two away you’ll probably need a period of formal retraining.
For the last year or so, increasing concerns have been raised over the effects of AI on critical thinking, and the Harvard Gazette published an interesting range of opinions last November. There has been extensive discussion of the dangers of ‘cognitive atrophy’ and the impairment of critical thought, but less about longer-term skill fade.
I write code because I enjoy doing so. I’m not good at coding by any means, but over the forty years that I’ve been learning to code I have had a great deal of pleasure. It’s a creative act, like painting, drawing on a rich range of cognitive skills and no little artistry. At the end you have created something of substance that might also benefit others.
So when someone comes along and advises me to start using Claude or another AI to write code for me, I can’t understand why I might want to stop coding and learn how to brief something else to steal my pleasure, any more than I might ask an AI to make me a painting. Moreover, were I to hand over one of my pleasures in life to AI, I know I’d find it progressively harder to code myself. While I might grow increasingly skilled at getting the AI to do much of the work, I’d also become increasingly dependent on its coding skills rather than mine.
At my age, that would remove one of my defences against the onset of dementia, and free up time to go painting more often. But what would it mean to a young engineer at the start of what they intend to be a bright career? At a time when their skills should only be developing, they’d be letting them fade. And who is going to have skills to transfer when they teach the next generation?
This extends beyond coding. Many of us are handing our writing to AI for it to summarise, one of its undisputed strengths. I started learning to write summaries before I turned 11, and have continued to develop and refine those skills for 60 years. If you’re only 20 now and leave this task to your favourite AI, how long before your summarising skills fade away?
Of course the vendors of AI want you dependent on their products. For a modest $200 to $3,600 a year you can abandon most of your independent skills to Claude, ChatGPT or Grok. If that were an investment in the further development of your skills, I could see the sense in it. But while AI can readily substitute for the cognitive challenges and critical thought you hand over to it, there’s no substitute for developing and maintaining your own essential professional skills.
I’m not advocating that you avoid AI altogether; there are times when it has its uses, and skilful use of any tool can always be turned to advantage. But if you write code, summaries or whole novels, you need to retain and develop your own skills alongside it. Like morphine, AI has great powers, but overused it can all too readily become both addictive and destructive.
