
Your Undivided Attention


By: Tristan Harris and Aza Raskin, The Center for Humane Technology

About this content

Join us every other Thursday to understand how new technologies are shaping the way we live, work, and think. Your Undivided Attention is produced by Senior Producer Julia Scott and Researcher/Producer Joshua Lash. Sasha Fegan is our Executive Producer. We are a member of the TED Audio Collective.

© 2019-2025 Center for Humane Technology

Categories: Relationships, Politics & Government, Political Science, Social Science
Episodes
  • Daniel Kokotajlo Forecasts the End of Human Dominance
    2025/07/17

In 2024, researcher Daniel Kokotajlo left OpenAI—and risked millions in stock options—to warn the world about the dangerous direction of AI development. Now he’s out with AI 2027, a forecast of where that direction might take us in the very near future.

    AI 2027 predicts a world where humans lose control over our destiny at the hands of misaligned, super-intelligent AI systems within just the next few years. That may sound like science fiction, but when you’re living on the upward slope of an exponential curve, science fiction can quickly become all too real. And you don’t have to agree with Daniel’s specific forecast to recognize that the incentives around AI could take us to a very bad place.

    We invited Daniel on the show this week to discuss those incentives, how they shape the outcomes he predicts in AI 2027, and what concrete steps we can take today to help prevent those outcomes.

    Your Undivided Attention is produced by the Center for Humane Technology. Follow us on X: @HumaneTech_. You can find a full transcript, key takeaways, and much more on our Substack.

    RECOMMENDED MEDIA
    The AI 2027 forecast from the AI Futures Project

    Daniel’s original AI 2026 blog post

    Further reading on Daniel’s departure from OpenAI

    Anthropic’s recent survey of emergent misalignment research

    Our statement in support of Sen. Grassley’s AI Whistleblower bill

    RECOMMENDED YUA EPISODES

    The Narrow Path: Sam Hammond on AI, Institutions, and the Fragile Future
    AGI Beyond the Buzz: What Is It, and Are We Ready?

    Behind the DeepSeek Hype, AI is Learning to Reason
    The Self-Preserving Machine: Why AI Learns to Deceive

    Clarification: Daniel K. referred to whistleblower protections that apply when companies “break promises” or “mislead the public.” There are no specific private sector whistleblower protections that use these standards. In almost every case, a specific law has to have been broken to trigger whistleblower protections.


    38 min
  • Is AI Productivity Worth Our Humanity? with Prof. Michael Sandel
    2025/06/26

    Tech leaders promise that AI automation will usher in an age of unprecedented abundance: cheap goods, universal high income, and freedom from the drudgery of work. But even if AI delivers material prosperity, will that prosperity be shared? And what happens to human dignity if our labor and contributions become obsolete?

    Political philosopher Michael Sandel joins Tristan Harris to explore why the promise of AI-driven abundance could deepen inequalities and leave our society hollow. Drawing from his landmark work on justice and merit, Sandel argues that this isn't just about economics: it's about what it means to be human when our role as workers in society vanishes, and whether democracy can survive if productivity becomes our only goal.

    We've seen this story before with globalization: promises of shared prosperity that instead hollowed out the industrial heart of communities, deepened economic inequalities, and left holes in the social fabric. Can we learn from the past, and steer the AI revolution in a more humane direction?

    Your Undivided Attention is produced by the Center for Humane Technology. Follow us on X: @HumaneTech_. You can find a full transcript, key takeaways, and much more on our Substack.

    RECOMMENDED MEDIA

    The Tyranny of Merit by Michael Sandel

    Democracy’s Discontent by Michael Sandel

    What Money Can’t Buy by Michael Sandel

    Take Michael’s online course “Justice”

    Michael’s discussion on AI Ethics at the World Economic Forum

    Further reading on “The Intelligence Curse”

    Read the full text of Robert F. Kennedy’s 1968 speech

    Read the full text of Dr. Martin Luther King Jr.’s 1968 speech

    Neil Postman’s lecture on the seven questions to ask of any new technology

    RECOMMENDED YUA EPISODES

    AGI Beyond the Buzz: What Is It, and Are We Ready?

    The Man Who Predicted the Downfall of Thinking

    The Tech-God Complex: Why We Need to be Skeptics

    The Three Rules of Humane Tech

    AI and Jobs: How to Make AI Work With Us, Not Against Us with Daron Acemoglu

    Mustafa Suleyman Says We Need to Contain AI. How Do We Do It?

    47 min
  • The Narrow Path: Sam Hammond on AI, Institutions, and the Fragile Future
    2025/06/12

    The race to develop ever-more-powerful AI is creating an unstable dynamic. It could lead us toward either dystopian centralized control or uncontrollable chaos. But there's a third option: a narrow path where technological power is matched with responsibility at every step.

    Sam Hammond is the chief economist at the Foundation for American Innovation. He brings a different perspective to this challenge than we do at CHT. Though he approaches AI from an innovation-first standpoint, we share a common mission on the biggest challenge facing humanity: finding and navigating this narrow path.

    This episode dives deep into the challenges ahead: How will AI reshape our institutions? Is complete surveillance inevitable, or can we build guardrails around it? Can our 19th-century government structures adapt fast enough, or will they be replaced by a faster-moving private sector? And perhaps most importantly: how do we solve the coordination problems that could determine whether we build AI as a tool to empower humanity or as a superintelligence that we can't control?

    We're in the final window of choice before AI becomes fully entangled with our economy and society. This conversation explores how we might still get this right.

    Your Undivided Attention is produced by the Center for Humane Technology. Follow us on X: @HumaneTech_. You can find a full transcript, key takeaways, and much more on our Substack.

    RECOMMENDED MEDIA

    Tristan’s TED talk on the Narrow Path

    Sam’s 95 Theses on AI

    Sam’s proposal for a Manhattan Project for AI Safety

    Sam’s series on AI and Leviathan

    The Narrow Corridor: States, Societies, and the Fate of Liberty by Daron Acemoglu and James Robinson

    Dario Amodei’s Machines of Loving Grace essay

    Bourgeois Dignity: Why Economics Can’t Explain the Modern World by Deirdre McCloskey

    The Paradox of Libertarianism by Tyler Cowen

    Dwarkesh Patel’s interview with Kevin Roberts at the FAI’s annual conference

    Further reading on surveillance with 6G

    RECOMMENDED YUA EPISODES

    AGI Beyond the Buzz: What Is It, and Are We Ready?

    The Self-Preserving Machine: Why AI Learns to Deceive

    The Tech-God Complex: Why We Need to be Skeptics

    Decoding Our DNA: How AI Supercharges Medical Breakthroughs and Biological Threats with Kevin Esvelt

    CORRECTIONS

    Sam referenced a blog post titled “The Libertarian Paradox” by Tyler Cowen. The actual title is the “Paradox of Libertarianism.”

    Sam also referenced a blog post titled “The Collapse of Complex Societies” by Eli Dourado. The actual title is “A beginner’s guide to sociopolitical collapse.”

    48 min

What listeners say about Your Undivided Attention
