Scalable Intelligence
The human brain is a marvellous machine. It ingests a constant stream of visual, auditory, tactile, and olfactory signals, which are processed in real time by around one hundred billion neurons, each communicating, via a mixture of electrical and chemical messages, with up to ten thousand others. The whole network is serviced by some one hundred and fifty thousand kilometres of blood vessels, carrying the nutrients that power the system and clearing away its waste. The complexity of the system is staggering, yet one of the most incredible facts about the brain is that all of this consumes less than 15 watts of energy. The entire information processing that makes humans the most intelligent life form on Earth, and that equips us with all our skills and abilities, consumes less energy than a single light bulb.
The hardware advances needed to achieve comparable efficiency are immense. Some may argue that, until they arrive, any attempt to build AI will hit a wall for lack of suitable hardware. I disagree, for multiple reasons.
First, not every artificial intelligence needs to match the capabilities of human intelligence to be useful. Intelligence, in the sense of an agent’s capability to demonstrate goal-oriented behaviour in an environment, is useful at any scale. Just as worms, insects, and birds are marvellous in their own right, demonstrating incredible abilities with a fraction of the resources that humans devote to their brains, small-scale AI systems have the power to make countless smaller applications more adaptive, more efficient, and smarter. Not all problems are equally hard, nor do they all require information processing at the same incredible scale.
Furthermore, an artificial intelligence is not bound by the same energy and space constraints as humans: we do not need to fit the hardware powering AI in a box of about a litre in volume, nor do we necessarily need to reduce its energy consumption to that of a single light bulb.
What we need to make progress on AI now is therefore not access to the ultimate hardware, but scalability, and we need it along multiple dimensions. Our algorithms need to scale well with the amount of experience they get to learn from, with the complexity of the problem, and with the computational resources that are available. As long as we focus on scalable intelligence in all of these senses, we can make long-term progress that is useful today and that will also be capable of exploiting the hardware we will have tomorrow.