The son of an old friend of mine is deep in the world of artificial intelligence and machine learning. Not so long ago, while trying to explain to me what he does, he started talking about transformer models (GPT-3 from OpenAI is probably the best-known example):
They basically try to do away with every traditional analytic assumption and just throw mountains (read: an Iceland-sized country's worth, for the big models) of electricity into computing outcomes.
To put that in perspective relative to the power demands and flexibility of human cognition, Andy Crouch writes:
Furthermore, human babies accomplish all this cognition with the roughly one-hundred-watt power supply of the human body (a single training run for GPT-3, one set of researchers estimated, consumes 189,000 kWh of power, roughly what a human being would consume over an entire lifetime). How would we ever engineer a silicon-based system to use so little power to mobilize curiosity, engage relationally, and infer effortlessly from a few examples the shape of the learner’s world? Now we truly seem in the realm of the inconceivable.
– Andy Crouch, The Life We're Looking For: Reclaiming Relationship in a Technological World.
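As a rough sanity check on that comparison, here is a minimal sketch of the arithmetic. The ~100-watt figure and the 189,000 kWh training estimate come from the quote; the 80-year lifespan is my own assumption, not a figure from the book.

```python
# Rough sanity check: a ~100 W human "power supply" over a lifetime
# versus the ~189,000 kWh estimated for a single GPT-3 training run.
# The 80-year lifespan is an assumption, not a figure from the book.

HUMAN_POWER_KW = 0.1            # ~100 watts, expressed in kilowatts
LIFESPAN_YEARS = 80             # assumed lifespan
HOURS_PER_YEAR = 365.25 * 24

lifetime_energy_kwh = HUMAN_POWER_KW * LIFESPAN_YEARS * HOURS_PER_YEAR
gpt3_training_kwh = 189_000     # estimate cited in the quote

print(f"Human lifetime energy: ~{lifetime_energy_kwh:,.0f} kWh")
print(f"GPT-3 training run:    ~{gpt3_training_kwh:,} kWh")
print(f"Ratio (training / lifetime): ~{gpt3_training_kwh / lifetime_energy_kwh:.1f}x")
```

Under that assumption a human lifetime comes to roughly 70,000 kWh, so the two figures sit within the same order of magnitude, which is all the quote's "roughly" claims.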