Joscha Bach on GPT-3, achieving AGI, machine understanding and lots more

02:40 What's missing in AI at the moment? A unified, coherent model of reality
04:14 AI systems like GPT-3 behave as if they understand – what's missing?
08:35 Symbol grounding – does GPT-3 have it?
09:35 GPT-3 for music, image and video generation
11:13 The GPT-3 temperature parameter. Strange output?
13:09 GPT-3 as a powerful tool for idea generation
14:05 GPT-3 as a tool for writing code. Will GPT-3 spawn a singularity?
16:32 Increasing GPT-3's input context may have a high impact
16:59 Identifying grammatical structure and language
19:46 What is the GPT-3 transformer network doing?
21:26 GPT-3 uses brute force, not zero-shot learning; humans do zero-shot learning
22:15 Extending the GPT-3 token context space. Current context = working memory. Humans with smaller current contexts integrate concepts over long time spans
24:07 GPT-3 can't write a good novel
25:09 GPT-3 needs to become sensitive to multi-modal sense data – video, audio, text, etc.
26:00 GPT-3 as a universal chatbot – conversations with God and Johann Wolfgang von Goethe
30:14 What does understanding mean? Does it have gradients (i.e. from primitive to high-level)?
32:19 Correlation vs. causation – what is causation? Does GPT-3 understand causation? Does GPT-3 do causation?
38:06 Deep-faking understanding
40:06 The metaphor of the Golem applied to civilization
42:33 GPT-3 is fine with a person in the loop. The big danger is a system that fakes understanding. Deep-faking intelligible explanations
44:32 GPT-3 babbling at the level of non-experts
45:14 Our civilization lacks …
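
The 11:13 chapter refers to GPT-3's sampling temperature, which controls how sharply the model's next-token distribution is peaked: low values are near-greedy, high values flatten the distribution and produce the "strange output" mentioned above. A minimal sketch of temperature-scaled softmax sampling (function and variable names are illustrative, not OpenAI's API):

```python
import math
import random

def sample_with_temperature(logits, temperature=1.0, rng=random):
    # Scale logits by 1/temperature: low T sharpens the distribution
    # toward the top token, high T flattens it (more surprising picks).
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Draw one token index according to the softmax probabilities.
    return rng.choices(range(len(probs)), weights=probs, k=1)[0]
```

At temperature well below 1 this almost always returns the highest-logit index; as temperature grows, lower-probability tokens are sampled more often.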