Musk called the gathering “historic” and stressed that AI could pose an existential threat to humanity if not properly regulated. “It’s like, hey, this is something that’s potentially risky for all humans everywhere,” he remarked.
The bipartisan AI Insight Forum was hosted by Senate Majority Leader Chuck Schumer along with Senators Mike Rounds, Todd Young, and Martin Heinrich. It brought together leaders from the biggest US tech firms to advise Congress on regulating AI.
During the closed-door session, Musk used the dire phrase “civilizational risk” to describe uncontrolled AI. This sparked alarm among the senators present. When later asked if AI could destroy humanity, Musk said there was “some chance above zero” of an apocalyptic outcome, even if low.
Zuckerberg argued that government bears ultimate responsibility for overseeing AI safely. “I agree Congress should engage with AI to support innovation and safeguards,” he stated. “This is an emerging technology, there are important equities to balance here, and the government is ultimately responsible for that.”
Other tech leaders highlighted the need for clearer AI regulations and standards. They also pressed for policies such as immigration reform to bring more AI talent into America.
Senator Cynthia Lummis said she was struck by the frank dialogue. “You had everything from there to sort of the high-level comment about the civilizational risks associated with AI, which is a very 60,000-foot-level remark, and it was everything in between, so I thought it was surprisingly interesting and helpful,” she noted.
Additional AI forums are planned throughout the year as Congress explores bipartisan legislation. Senator Schumer said the rare consensus for government involvement makes clear “we have to try to act, as difficult as the process might be.”
With AI poised to disrupt society and business, lawmakers face mounting pressure to shape its development. But crafting effective, balanced guardrails will prove challenging. For now, raising awareness of AI’s double-edged potential appears to be a tentative first step.