London Futurists

The terabrain is near, with Simon Thorpe

London Futurists Season 1 Episode 9

Why do human brains consume much less power than artificial neural networks? Simon Thorpe, Research Director at CNRS, explains his view that the key to artificial general intelligence is a "terabrain" that copies from the human brain its sparse-firing networks of spiking neurons.
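
To make the contrast concrete, here is a minimal sketch of a leaky integrate-and-fire neuron. It is written for these notes and is not Simon's Terabrain software; all names and parameters are invented. It illustrates the event-driven idea that comes up throughout the conversation: the neuron communicates in discrete spikes and only does meaningful work when input spikes arrive.

```python
# Minimal leaky integrate-and-fire neuron: a toy illustration of
# "spikes, not floating point numbers". Not Simon Thorpe's code;
# every parameter here is an assumption made for the example.

def simulate_lif(input_spike_times, threshold=1.0, leak=0.9,
                 weight=0.3, steps=50):
    """Membrane potential decays each step, jumps when an input
    spike arrives, and emits an output spike (then resets) when
    it crosses threshold."""
    potential = 0.0
    output_spikes = []
    inputs = set(input_spike_times)
    for t in range(steps):
        potential *= leak            # passive leak
        if t in inputs:              # incoming spike adds synaptic weight
            potential += weight
        if potential >= threshold:   # fire and reset
            output_spikes.append(t)
            potential = 0.0
    return output_spikes

# Sparse input: only a burst of near-coincident spikes drives the
# neuron over threshold, so isolated spikes cost almost nothing.
print(simulate_lif([3, 4, 5, 6, 20, 40]))  # -> [6]
```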

00:11 Recapping "the AI paradox"
00:28 The nervousness of CTOs regarding AI
00:43 Introducing Simon
01:43 45 years since Oxford, working out how the brain does amazing things
02:45 Brain visual perception as feed-forward vs. feedback
03:40 The ideas behind the system that performed so well in the 2012 ImageNet challenge
04:20 The role of prompts to alter perception
05:30 Drawbacks of human perceptual expectations
06:05 The video of a gorilla on the basketball court
06:50 Conjuring tricks and distractions
07:10 Energy consumption: human neurons vs. artificial neurons
07:26 The standard model would need 500 petaflops (see the back-of-envelope sketch after this list)
08:40 Exaflop computing has just arrived
08:50 30 MW vs. 20 W (less than a lightbulb)
09:34 Companies working on low-power computing systems
09:48 Power requirements for edge computing
10:10 The need for 86,000 neuromorphic chips?
10:25 Dense activation of neurons vs. sparse activation
10:58 Real brains are event driven
11:16 Real neurons send spikes, not floating point numbers
11:55 SpikeNET by Arnaud Delorme
12:50 Why are sparse networks studied so little?
14:40 A recent debate with Yann LeCun of Facebook and Bill Dally of Nvidia
15:40 One spike can contain many bits of information
16:24 Revisiting Lord Edgar Adrian's 1927 experiment with eels
17:06 Biology just needs one spike
17:50 Chips moved from floating point to fixed point
19:25 Other mentions of sparse systems: MoE (Mixture of Experts)
19:50 Sparse systems are easier to interpret
20:30 Advocacy for "grandmother cells"
21:23 Chicks that imprinted on yellow boots
22:35 A semantic web in the 1960s
22:50 The Mozart cell
23:02 An expert system implemented in a neural network with spiking neurons
23:14 Power consumption reduced by a factor of one million
23:40 Experimental progress
23:53 Dedicated silicon: Spikenet Technology, acquired by BrainChip
24:18 The Terabrain Project, using standard off-the-shelf hardware
24:40 Impressive recent simulations on GPUs and on a MacBook Pro
26:26 A homegrown learning rule
26:44 Experiments with "frozen noise"
27:28 Anticipating emulating an entire human brain on a Mac Studio M1 Ultra
28:25 The likely impact of these ideas
29:00 This software will be given away
29:17 Anticipating "local learning" without the results being sent to Big Tech
30:40 GPT-3 could run on your phone next year
31:12 Our interview next year might be not with Simon, but with his Terabrain
31:22 Our phones know us better than our spouses do
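
The power figures quoted in the list invite a quick back-of-envelope check. The sketch below uses assumed round numbers (roughly 86 billion neurons, ~10,000 synapses per neuron, dense updates a few hundred times per second; the synapse count and update rate are our assumptions, not figures from the episode) to show how the ~500 petaflop estimate and the roughly million-fold power gap can arise.

```python
# Back-of-envelope check on the figures quoted in the episode.
# Synapse count and update rate are assumptions chosen to show
# how a ~500 petaflop estimate can arise for dense simulation.

neurons = 86e9             # ~86 billion neurons in a human brain
synapses_per_neuron = 1e4  # ~10,000 synapses each (rough textbook figure)
update_rate_hz = 300       # assume dense updates a few hundred times/s
ops_per_update = 2         # one multiply + one add per synapse

flops = neurons * synapses_per_neuron * update_rate_hz * ops_per_update
print(f"Dense simulation: {flops:.1e} flop/s")  # ~5.2e17, i.e. ~500 petaflops

supercomputer_watts = 30e6  # ~30 MW for an exascale machine
brain_watts = 20            # the brain's ~20 W budget
print(f"Power ratio: {supercomputer_watts / brain_watts:.1e}")  # ~1.5e6
```

Sparse, event-driven firing attacks the first factor directly: if only a small fraction of neurons is active at any moment, most of those synaptic updates never need to happen.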

Simon's academic page: https://cerco.cnrs.fr/page-perso-simon-thorpe/

Simon's personal blog: https://simonthorpesideas.blogspot.com/

Audio engineering by Alexander Chace.

Music: Spike Protein, by Koi Discovery, available under CC0 1.0 Public Domain Dedication
