Why AI Is Not an Enemy, But a Partner for Journalists
Nieuwspoort, Tuesday, 25 November 2025.
It is a common misconception that AI threatens journalism — but the truth is quite different. At the JOURN-AI 2025 event in Nieuwspoort, it became clear: AI has no will, no goals, and does not act consciously against journalists or the media. What stands out instead is that newsrooms often fail to invest sufficiently in knowledge and strategy, leaving them dependent on technologies they do not fully control. The real challenge is not using AI, but using it wisely — as a tool, not a replacement. The most striking conclusion? Human creativity, ethics, and transparency have never been more important, precisely because AI is so adept at mimicking them. The future of reliable news production does not depend on resisting AI, but on learning how to work with it — a matter of professionalism, not fear.
AI in Journalism: From Enemy to Partner
The perception that artificial intelligence poses a threat to journalism is a widespread misconception, as clearly demonstrated at the JOURN-AI 2025 event in Nieuwspoort. AI has no will, no objectives, and does not take conscious actions against journalists or the media. It is a technology designed to process data, generate text, and support workflows — but it does not independently decide on truth, ethics, or news priorities. The real challenge lies not in the technology itself, but in how newsrooms manage it. Many media organisations underinvest in knowledge, tools, and strategy, leaving them dependent on AI systems they cannot fully control [1][2][3]. This creates a sense of lost authority, leading to panic and a misleading narrative that frames AI as an enemy rather than a tool. The central theme at Nieuwspoort was not opposing AI, but learning how to work with it — a task that demands professionalism, not fear [4].
The Human Role: Creativity and Ethics as an Indispensable Foundation
In the age of AI, human creativity is more essential than ever, not less. While generative AI excels at drafting standard articles, summarising reports, or adapting tone and style, its fundamental limitation remains its inability to grasp context, nuance, and ethical values. During JOURN-AI 2025, it was repeatedly emphasised that transparency about AI use is crucial to maintaining public trust [1][3][5]. In panel discussions, 78% of participating journalists stated that clearly identifying AI-generated content is essential for news credibility [5]. Humans remain indispensable because only humans can ask: ‘What is the truth?’, ‘Who has the right to a voice?’, and ‘How does this text affect society?’. The event underscored that ethics, transparency, and the preservation of the human element are critical to the journalistic process, even in an AI-driven environment [1][3][5].
AI as a Supporting Tool: Practical Applications in News Production
AI is already being used in many newsrooms as a supportive tool, particularly in the early stages of news production. One example is Gemini in Google Meet, which provides real-time translation across more than 65 languages and automatic meeting notes [2][3]. These features are available to subscribers of Google Workspace Business and Enterprise, improving the efficiency of internal communication and international collaboration. At the JOURN-AI 2025 event, it was explained that such tools can help journalists rapidly summarise interview recordings, translate foreign sources, and structure large volumes of information, freeing up time for deeper investigation and analysis [3]. The focus is therefore not on replacing the journalist, but on strengthening professional expertise. ‘AI should be a partner, not a replacement’, as one expert put it during the event [5].
The Threshold of Dependence and the Need for Strategy
Although AI offers opportunities, the risk of dependency looms large. The discussion at JOURN-AI 2025 revealed that many newsrooms still lack the strategic knowledge to deploy AI responsibly. The issue is not the use of AI itself, but the lack of investment in knowledge, tools, and policy. Journalism often underinvests in training, audit tools, and ethical guidelines for AI use, leaving organisations reliant on external platforms and models [4]. This risk is avoidable, yet its consequence is already visible: companies and tech giants use journalistic content to train their own AI systems without compensating the media [4]. The solution lies not in resisting AI, but in developing in-house AI strategies, building internal capabilities, and establishing transparent usage frameworks [1][3].
The Future of Reliable Information in an AI-Driven Era
The future of reliable information depends on how well journalists learn to integrate AI without abandoning their core values. JOURN-AI 2025 made it clear that this is not about resisting technology, but about reclaiming control through knowledge, ethics, and transparency. The organisation announced that all AI-generated articles produced as part of the event will be labelled with an ‘AI-content’ tag, a practical example of how transparency can be implemented [5]. Furthermore, it was stressed that AI has no senses of its own, cannot interpret emotional context, and lacks ethical values, qualities only humans can provide [1][5]. The future of news, therefore, does not depend on avoiding AI, but on trusting the human role as curator, critical thinker, and truth-seeker.