
London Futurists
Anticipating and managing exponential impact - hosts David Wood and Calum Chace
Calum Chace is a sought-after keynote speaker and best-selling writer on artificial intelligence. He focuses on the medium- and long-term impact of AI on all of us, our societies, and our economies. He advises companies and governments on AI policy.
His non-fiction books on AI are Surviving AI, about superintelligence, and The Economic Singularity, about the future of jobs. Both are now in their third editions.
He also wrote Pandora's Brain and Pandora's Oracle, a pair of techno-thrillers about the first superintelligence. He is a regular contributor to magazines, newspapers, and radio.
In the last decade, Calum has given over 150 talks in 20 countries on six continents. Videos of his talks, along with lots of other material, are available at https://calumchace.com/.
He is co-founder of a think tank focused on the future of jobs, called the Economic Singularity Foundation. The Foundation has published Stories from 2045, a collection of short stories written by its members.
Before becoming a full-time writer and speaker, Calum had a 30-year career in journalism and in business, as a marketer, a strategy consultant and a CEO. He studied philosophy, politics, and economics at Oxford University, which confirmed his suspicion that science fiction is actually philosophy in fancy dress.
David Wood is Chair of London Futurists, and is the author or lead editor of twelve books about the future, including The Singularity Principles, Vital Foresight, The Abolition of Aging, Smartphones and Beyond, and Sustainable Superabundance.
He is also principal of the independent futurist consultancy and publisher Delta Wisdom, executive director of the Longevity Escape Velocity (LEV) Foundation, Foresight Advisor at SingularityNET, and a board director at the IEET (Institute for Ethics and Emerging Technologies). He regularly gives keynote talks around the world on how to prepare for radical disruption. See https://deltawisdom.com/.
As a pioneer of the mobile computing and smartphone industry, he co-founded Symbian in 1998. By 2012, software written by his teams had been included as the operating system on 500 million smartphones.
From 2010 to 2013, he was Technology Planning Lead (CTO) of Accenture Mobility, where he also co-led Accenture’s Mobility Health business initiative.
He has an MA in Mathematics from Cambridge, where he also undertook doctoral research in the Philosophy of Science, and a DSc from the University of Westminster.
Humanity's final four years? with James Norris
In this episode, we return to the subject of existential risks, but with a focus on the actions that can be taken to eliminate or reduce them.
Our guest is James Norris, who describes himself on his website as an existential safety advocate. The website lists four primary organizations which he leads: the International AI Governance Alliance, Upgradable, the Center for Existential Safety, and Survival Sanctuaries.
Previously, one of James' many successful initiatives was Effective Altruism Global, the international conference series for effective altruists. He also spent some time organizing Bay Area Futurists, a kind of sibling organization to London Futurists. He graduated from the University of Texas at Austin with a triple major in psychology, sociology, and philosophy, as well as minors in too many subjects to mention.
Selected follow-ups:
- James Norris website
- Upgrade your life & legacy - Upgradable
- The 7 Habits of Highly Effective People (Stephen Covey)
- Beneficial AI 2017 - Asilomar conference
- "...superintelligence in a few thousand days" - Sam Altman blogpost
- Amara's Law - DevIQ
- The Probability of Nuclear War (JFK estimate)
- AI Designs Chemical Weapons - The Batch
- The Vulnerable World Hypothesis - Nick Bostrom
- We Need To Build Trustworthy AI Systems To Monitor Other AI: Yoshua Bengio
- Instrumental convergence - Wikipedia
- Neanderthal extinction - Wikipedia
- Matrioshka brain - Wikipedia
- Will there be a 'WW3' before 2050? - Manifold prediction market
- Existential Safety Action Pledge
- An Urgent Call for Global AI Governance - IAIGA petition
- Build your survival sanctuary
Other people mentioned include:
- Eliezer Yudkowsky, Roman Yampolskiy, Yann LeCun, Andrew Ng
Music: Spike Protein, by Koi Discovery, available under CC0 1.0 Universal (Public Domain Dedication)