DC-Journalism Launches 2024–2025 Annual Report on Artificial Intelligence and Journalism

Mira Milosevic, Executive Director, Global Forum for Media Development

The Dynamic Coalition on the Sustainability of Journalism and News Media (DC-Journalism), a hub for press freedom, media development, and journalism sectors to engage with important Internet governance and digital policy matters within the Internet Governance Forum (IGF), has unveiled its 2024–2025 Annual Report, centred on Artificial Intelligence (AI) and Journalism.

The report is the product of a global collaborative effort that brought together critical reflections and real-world experiences from journalists, researchers, media practitioners, and policy experts grappling with the growing influence of artificial intelligence on the news ecosystem.

The report reveals how AI is transforming every aspect of journalism, from content production and distribution to business models, platform governance, and media sustainability. It brings narratives from across the globe, including stories of journalists leveraging generative AI in emergency reporting, small media outlets in the Global South struggling with the rising cost of AI infrastructure, and grassroots collaborations that centre ethical, inclusive design.

The report captures a rapidly evolving media landscape where artificial intelligence is no longer a futuristic concept but a daily reality shaping how news is produced, shared, and consumed. Through a collection of article abstracts contributed by coalition members, it explores five interlinked areas: AI governance, content production, the economic sustainability of journalism, risks and harms posed by AI, and the growing dominance of tech platforms in shaping media outcomes.

The report examines how AI is being deployed in newsrooms to streamline reporting workflows, enhance audience targeting, and automate editorial decisions, while also raising critical concerns about algorithmic bias, misinformation, and the erosion of editorial independence. These developments, the report warns, have far-reaching implications, not just for journalism, but for democratic societies that depend on reliable, inclusive, and independent information.

What emerges is a nuanced and, at times, sobering picture of the opportunities and dangers presented by AI. According to the report, in many Global South contexts, media outlets are not only struggling to access cutting-edge AI tools but are also being systematically marginalised by algorithmic systems that prioritise dominant market players. The uneven platformisation of journalism, where digital platforms act as powerful intermediaries between news producers and audiences, has deepened existing inequalities in media visibility and financial sustainability.

At the same time, the report shares examples of resistance and innovation: community-led initiatives where journalists, developers, and civic actors co-create AI tools tailored to local languages and contexts; and proposals for international standards to ensure that AI systems support, rather than undermine, the core values of public interest journalism.

The report also delves into the risks and harms associated with the unchecked use of AI in journalism. It highlights how AI-generated content can easily blur the lines between fact and fabrication, especially when deployed without proper oversight or editorial accountability. Deepfakes, misinformation, and algorithm-driven amplification of false narratives present real threats to democratic discourse and public trust.

The report raises alarms about the potential for AI to devalue human-centred journalism by automating news production at scale, often without regard for context, ethics, or impact. There is a particular concern for marginalised voices and communities, whose stories risk being further sidelined by algorithms trained on biased or incomplete data. Without intervention, the report warns, AI could accelerate the erosion of critical reporting, editorial diversity, and the watchdog role of the press.

Despite these challenges, the report is not without hope. It positions AI not solely as a disruptor, but as a tool that, if governed wisely and used ethically, can help revitalise journalism in meaningful ways. Several contributions underscore the importance of multi-stakeholder collaboration in developing AI tools that are transparent, accountable, and centred on public interest values.

The report encourages dialogue between journalists, technologists, policymakers, and civil society actors to co-create frameworks that ensure fairness, inclusivity, and respect for human rights in AI deployment. By prioritising local knowledge, public service journalism, and equitable access to technology, the Dynamic Coalition on the Sustainability of Journalism and News Media envisions a future where AI strengthens, rather than weakens, the foundations of informed societies.

The report is a call to action, urging all stakeholders, from governments and platforms to journalists and civil society, to develop shared strategies that safeguard the integrity, viability, and democratic role of journalism in an AI-driven world. It challenges decision-makers to move beyond reactive policymaking and instead invest in forward-looking frameworks that place human rights, media freedom, and digital inclusion at the centre of AI governance. In doing so, it invites a collective reimagining of how journalism can not only survive but thrive in the face of technological disruption. The future of news, the report suggests, depends not on resisting change but on shaping it: boldly, ethically, and together.

The report was edited by Waqas Naeem, Daniel O’Maley, Courtney C. Radsch, Aws Al-Saadi, Lei Ma, and Nompilo Simanje, with institutional support from the Global Forum for Media Development (GFMD) and presented at the United Nations Internet Governance Forum in June 2025.