
power of a lifetime

The son of an old friend of mine is deep in the world of artificial intelligence and machine learning. Not so long ago, while trying to explain to me what he does, he started talking about transformer models (GPT-3 from OpenAI is probably the best-known example):

They basically try and do away with every traditional analytic assumption and just throw mountains (read: Iceland-sized countries, for the big models) of electricity into computing outcomes.

To put that into perspective against the power demands and flexibility of human cognition, Andy Crouch writes:

Furthermore, human babies accomplish all this cognition with the roughly one-hundred-watt power supply of the human body (a single training run for GPT-3, one set of researchers estimated, consumes 189,000 kWh of power, roughly what a human being would consume over an entire lifetime). How would we ever engineer a silicon-based system to use so little power to mobilize curiosity, engage relationally, and infer effortlessly from a few examples the shape of the learner’s world? Now we truly seem in the realm of the inconceivable.

– Andy Crouch, The Life We're Looking For: Reclaiming Relationship in a Technological World.
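
Out of curiosity, here is a back-of-envelope sketch of those numbers. The 100 W figure and the 189,000 kWh estimate come from the quote; the continuous draw and the 80-year lifespan are assumptions of mine, just to make the comparison concrete:

```python
# Back-of-envelope comparison: human bodily energy vs. a GPT-3 training run.
# Assumptions (mine, not Crouch's): a continuous 100 W draw and an 80-year lifespan.

HUMAN_POWER_W = 100          # rough power supply of the human body, from the quote
LIFESPAN_YEARS = 80          # assumed lifespan
GPT3_TRAINING_KWH = 189_000  # training-run estimate cited in the quote

HOURS_PER_YEAR = 24 * 365

# kWh consumed by the body in a year and over a lifetime
human_kwh_per_year = HUMAN_POWER_W * HOURS_PER_YEAR / 1000  # ~876 kWh
human_kwh_lifetime = human_kwh_per_year * LIFESPAN_YEARS    # ~70,000 kWh

print(f"Human, per year:  {human_kwh_per_year:,.0f} kWh")
print(f"Human, lifetime:  {human_kwh_lifetime:,.0f} kWh")
print(f"GPT-3 training:   {GPT3_TRAINING_KWH:,} kWh")
print(f"Training run ≈ {GPT3_TRAINING_KWH / human_kwh_lifetime:.1f} human lifetimes")
```

On those assumptions the training run works out to a few human lifetimes of bodily energy, the same rough order of magnitude Crouch is pointing at.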
