Why AI Cannot Replace the Human Voice in Journalism
Amsterdam, Monday, 3 November 2025.
One of the most striking warnings from historian Koos-jan de Jager is that writing is more than data processing: it is storytelling from a personal perspective. He cautions that overreliance on AI risks undermining depth, human interpretation, and personal experience in news reporting. According to him, journalism becomes dull when the unique human viewpoint is exchanged for generic, machine-generated text. His message is clear: AI can be a useful tool, but the heart of journalism remains the human story — something no algorithm can truly replicate.
The Human Core of Journalism: More Than Data Processing
Koos-jan de Jager, historian and lecturer in investigative journalism at the CHE, warns that relying on artificial intelligence in journalism may undermine the essential human element of storytelling. He emphasizes that writing is not merely the processing of facts and data, but the telling of a story from a unique perspective grounded in personal experience and human interpretation [1]. In his view, relying solely on AI-based information leaves one with a shallow understanding of the world behind the screen [1]. This warning comes amid an increasing debate about the role of AI in media, particularly in educational institutions training future journalists [1]. According to De Jager, the core of journalism lies in the ability to tell a story that is not only informative, but also emotional, ethically grounded, and contextually embedded — something he argues no algorithm can replicate [1].
AI as a Tool, Not a Replacement: Balancing Practice
Although Koos-jan de Jager is cautious about full dependence on AI, he acknowledges that it can be a valuable addition to a journalist’s toolkit [1]. The technology is already used for tasks such as summarising lengthy documents, generating initial drafts of reports based on data, or identifying patterns in large datasets — tasks that would otherwise consume significant time and energy for humans [1]. In practice, AI is used to assist journalists in gathering material, locating relevant sources, or generating preliminary text that is later corrected and expanded by human editors [1]. These applications can boost efficiency, especially for repetitive or data-intensive assignments such as election results or economic reports [1]. Nevertheless, De Jager stresses that the use of AI must not lead to the replacement of the human perspective — a personal view of the world shaped by life experience, ethical considerations, and cultural context [1].
Impact on News Production: Speed Versus Depth
The application of AI in news production has led to a notable acceleration in the pace at which stories are published. Platforms such as NRC.nl and Terdege.nl demonstrate how AI is employed to quickly generate reports based on data, such as election results or economic figures [2][1]. For example, during the 2025 election period, results were published across multiple media outlets within hours, often with support from AI tools [2]. This speed is a clear advantage, particularly in an era where news is expected to be available instantly [1]. However, it carries a risk: when speed takes priority over depth, critical contextual elements, historical background, or human consequences may be overlooked [1]. De Jager's warning is that a story based solely on data, without human interpretation, becomes dull and impersonal — a critique echoed in multiple media forum discussions [1][4].
Ethics, Accountability, and the Threat of ‘Boredom’ in News
One of the greatest ethical concerns surrounding AI in journalism is the risk of losing accountability and authenticity. If a story is fully generated by an AI trained on data from Silicon Valley or China, its content may diverge from local contexts, cultural nuances, or historical facts [1]. De Jager points out that substituting a 'generic computer-generated product' for a human perspective erodes the journalist's unique voice [1]. This creates the danger of a 'boring' media landscape in which all articles look the same, lacking depth or emotional resonance [1]. Moreover, holding AI accountable for errors, misinformation, or bias present in training data is difficult, as it lacks consciousness or intention [1]. The ethical aspects of AI in journalism were also discussed on the Reformed Forum, active on 3 November 2025, with a focus on human interpretation as the cornerstone of responsible journalism [4]. This ethical dimension becomes increasingly important as AI is deployed in higher-level editorial roles.
Journalism Education: A Crossroads of Technology and Humanity
In the programme at the CHE, where Koos-jan de Jager teaches, a critical stance toward AI is central. Students are trained not only in research and writing techniques, but also in understanding their role as human storytellers. De Jager stresses that a journalist's personal perspective cannot be replaced by an algorithm — it is a matter of identity, lived experience, and responsibility [1]. Within the curriculum, AI is used as a tool for processing large volumes of data, but never as a substitute for critical thinking or narrative craft [1]. The debate over AI's role in journalism among students remains active on platforms such as the Reformed Forum, where critical questions are raised about its use in the discipline [4]. Thus, the transition from traditional journalism to an AI-assisted practice requires not only technical skills, but also a strong ethical foundation developed through education [1].