AI: The Dangerous Path to Human Enhancement 🦿🔥
Yuval Noah Harari is an Israeli public intellectual, historian and professor in the Department of History at the Hebrew University of Jerusalem. He is the author of the popular science bestseller Sapiens: A Brief History of Humankind.
⮞Key Takeaways on the Not-So-Distant Future of Mankind:
💣 There is a possibility that our species, Homo sapiens, may not exist in a century or two due to potential threats.
🤖 Giving too much power to AI that becomes uncontrollable poses a significant risk to our survival.
🔮 If we manage to survive, future generations may have immense power to alter themselves using various technologies.
🧬 The descendants of Homo sapiens could be so different from us that they would no longer be recognizable as the same species, even more distinct than the difference between Homo sapiens and Neanderthals.
🔥 The development of advanced technologies carries the danger of misuse, with potential downgrades rather than upgrades to human qualities.
💡 Corporations, armies, and ruthless politicians could control the direction of human enhancement, focusing on certain qualities while neglecting vital aspects such as compassion, artistic sensitivity, and spirituality.
Link to the full podcast:
https://www.youtube.com/watch?v=Mde2q7GFCrw&t=501s&ab_channel=LexFridman
DISCLAIMER: This channel is not created, operated, or in any form endorsed by Yuval Noah Harari or Lex Fridman. I am only sharing their content.
Copyright disclaimer:
Under Section 107 of the Copyright Act 1976, allowance is made for "fair use" for purposes such as criticism, comment, news reporting, teaching, scholarship, and research. Fair use is a use permitted by copyright statute that might otherwise be infringing.
#ai #yuvalnoahharari #sapiens #future #awareness #world #danger