London Futurists

The shocking problem of superintelligence, with Connor Leahy

October 25, 2023 (Season 1, Episode 61)

This is the second episode in which we discuss the upcoming Global AI Safety Summit taking place on 1st and 2nd of November at Bletchley Park in England.

We are delighted that our guest in this episode is one of the hundred or so people who will attend that summit: Connor Leahy, a German-American AI researcher and entrepreneur.

In 2020 he co-founded EleutherAI, a non-profit research institute which has helped develop a number of open-source models, including Stable Diffusion. Two years later he co-founded Conjecture, which aims to scale AI alignment research. Conjecture is a for-profit company, but its focus remains very much on figuring out how to ensure that the arrival of superintelligence is beneficial to humanity rather than disastrous.

Selected follow-ups:
https://www.conjecture.dev/
https://www.linkedin.com/in/connor-j-leahy/
https://www.gov.uk/government/publications/ai-safety-summit-programme/ai-safety-summit-day-1-and-2-programme
https://www.gov.uk/government/publications/ai-safety-summit-introduction/ai-safety-summit-introduction-html
An open event at Wilton Hall, Bletchley, the afternoon before the AI Safety Summit starts: https://www.meetup.com/london-futurists/events/296765860/

Music: Spike Protein, by Koi Discovery, available under the CC0 1.0 Public Domain Dedication

