The last few episodes of our podcast have explored what GPT (generative pre-trained transformer) technology is and how it works, as well as the call for a pause in the development of advanced AI. In this latest episode, Ted Lappas, a data scientist and academic, helps us take a pragmatic turn: understanding what GPT technology can do for each of us individually.
Ted is Assistant Professor at Athens University of Economics and Business, and he also works at Satalia, which was London's largest independent AI consultancy before it was acquired last year by the media giant WPP.
Topics addressed in this episode include:
*) The "GPT paradox": If GPT-4 is so good, why aren't more people using it to boost their effectiveness in their workplace?
*) Concerns in some companies that data entered into GPTs will leak out and assist their competitors
*) Uses of GPTs to create or manipulate text, and to help developers to understand new code
*) GPTs as "brains" that lack the "limbs" that would make them truly useful
*) GPT capabilities are being augmented via plug-ins that access sites like Expedia, Instacart, or Zapier
*) Agent-based systems such as AutoGPT and AgentGPT that utilise GPTs to break down tasks into steps and then carry out these steps
*) Comparison with the boost given to Apple iPhone adoption by the launch, one year later, of the iOS App Store
*) Ted's use of GPT-4 in his role as a meta-reviewer for papers submitted to an academic conference, with Ted acting as an orchestrator more than a writer
*) The learning curve is easier for vanilla GPTs than for agent systems that use GPTs
*) GPTs are currently more suited to low-end writing than to high-end writing, but are expected to move up the value chain
*) Ways to configure a GPT so that it can reproduce the quality level or textual style of a specific writer
*) Calum's use of GPT-4 in his side-project as a travel writer
*) Ways to stop GPTs inventing false anecdotes
*) Some users of GPTs will lose all faith in them due to just a single hallucination
*) Teaching GPTs to say "I don't know" or to state their level of confidence about claims they make
*) Creating an embedding space search engine
*) The case for gaining a working knowledge of the programming language Python
*) The growth of technology-explainer videos on TikTok and Instagram
*) "Explain this to me like I'm ten years old"
*) The way to learn more about GPTs is to use them in a meaningful project
*) Learning about image-generating models such as DALL-E and Midjourney, which produce images rather than text
*) Uses of image models for inpainting: blending new features into an existing image
*) The advantages of open source tools, such as those available on Hugging Face
*) Images will be largely solved in 2023; 2024 will be the year for video
*) An appeal to "dive in, the sooner the better"
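One of the topics above, the embedding space search engine, can be sketched in a few lines of Python. This is a toy illustration, not anything from the episode itself: documents and queries are mapped to vectors, and search becomes nearest-neighbour lookup by cosine similarity. Here the "embedding" is a simple bag-of-words vector; a real system would use learned embeddings from a neural encoder.

```python
import math
from collections import Counter

def embed(text):
    """Map text to a sparse bag-of-words vector (a stand-in for a learned embedding)."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse vectors."""
    dot = sum(a[t] * b[t] for t in a if t in b)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def search(query, documents):
    """Return documents ranked by similarity to the query in embedding space."""
    q = embed(query)
    return sorted(documents, key=lambda d: cosine(q, embed(d)), reverse=True)

docs = [
    "GPT models generate text from prompts",
    "Inpainting blends new features into an image",
    "Agent systems break tasks into steps",
]
print(search("how do agents split a task into steps", docs)[0])
```

Swapping `embed` for a call to a learned embedding model turns this toy into the real thing; the ranking logic stays the same.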
Music: Spike Protein, by Koi Discovery, available under the CC0 1.0 Public Domain Dedication
Listen on: Apple Podcasts | Spotify