“Innovation in science and technology, from stone blades in the Stone Age to the printing press and the atomic bomb, has always served human needs. Now, however, AI weapons can independently decide whom to bomb, and AI systems can even develop entirely new weapons on their own. This is a threat unlike anything we have witnessed before,” said renowned Israeli historian Yuval Harari.
Harari visited South Korea for the first time in eight years to mark the publication of the Korean edition of “Nexus,” his latest book. His previous visit in 2017 coincided with the release of “Homo Deus.” During an interview in Jongno-gu, Seoul, he shared his insights on the future of AI and its implications for humanity.
“To tackle the challenges of AI, global cooperation is more crucial than at any point in human history. Yet trust between people is deteriorating at an alarming rate,” Harari said. “Even though we have the most advanced information technology in history, meaningful conversation has become harder than ever.”
He pointed to AI-driven algorithms as a major culprit, arguing that they are designed to maximize user engagement rather than seek the truth. As a result, AI amplifies misinformation and content that fuels fear and hatred.
“Traditional media such as newspapers and television verify information through rigorous editorial processes, but in today’s digital media landscape, truth often sinks to the bottom. Meanwhile, misinformation, which is cheap to produce and visually appealing, dominates the space and poses a serious threat to democracy,” he warned.
Harari emphasized the need to solve what he calls the “paradox of trust” to prevent even greater AI-related risks.
“When asked why they are advancing AI so rapidly, industry leaders often respond, ‘If we slow down, our competitors will outpace us.’ This race fuels distrust and conflict. Instead, we must establish a framework that fosters mutual trust. AI trained in an environment of trust, rather than rivalry and suspicion, can be reliable,” he asserted.
Harari also urged individuals to reconsider how they engage with AI. “Just as our stomachs need time to digest food, we must process information rather than passively absorb it,” he advised.
By Lee Ji-yoon, leemail@donga.com