New Report Lays Out Steps for News Organizations Developing AI Strategies

Andrew Cochran
JournalismAI

JournalismAI, a global initiative established to empower news organisations to use artificial intelligence (AI) responsibly, has published a new report laying out steps that news organisations should take in developing AI strategies.

The report was published in September in collaboration with Polis, the media think tank of the London School of Economics and Political Science (LSE) based in its Department of Media and Communications, and was supported by the Google News Initiative (GNI).

Titled “Generating Change: A global survey of what news organisations are doing with artificial intelligence”, the 90-page report is based on a survey of 105 news and media organisations, small and large, across 46 countries regarding AI and associated technologies. It includes the perspectives of more than 120 editors, journalists, technologists and media makers.

The survey found that AI remains unevenly distributed, both between small and large newsrooms and between Global South and Global North countries. The social and economic benefits of AI are geographically concentrated in the Global North, which enjoys strong infrastructure and resources, while many countries in the Global South grapple with the social, cultural, and economic repercussions of post-independence colonialism.

According to the report, more than 75 percent of survey respondents use AI in at least one area of the news value chain: newsgathering, production, or distribution.

It said newsrooms take a wide range of approaches to AI strategy, depending on their size, mission, and access to resources. Some early adopters are focusing on achieving AI interoperability with existing systems, others have adopted a case-by-case approach, and some media development organisations are working to build AI capacity in regions with low AI literacy.

The report identified financial constraints and technical difficulties as the most pressing ongoing challenges to integrating AI technologies in the newsroom, but said ethical concerns remained significant for respondents.

It also said cultural resistance, fears of job displacement, and scepticism of AI technologies cannot be discounted. Across the board, respondents noted that mitigating AI integration challenges requires bridging knowledge gaps among the various teams in the newsroom.

According to the report, more than 60 percent of respondents are concerned about the ethical implications of AI integration for editorial quality and other aspects of journalism, as journalists try to work out how to integrate AI technologies into their work while upholding journalistic values such as accuracy, fairness, and transparency.

It said respondents called for transparency from the designers of AI systems and technology companies, and for newsrooms, as users of these systems, to be transparent with their audiences, while journalists and media-makers continued to stress the need for a “human in the loop” approach.

The report noted fears that AI technologies would further commercialise journalism, boosting poor-quality and polarising content and leading to a further decline in public trust in journalism.

It said: “Tech companies are driving innovation in AI and other technologies, but survey participants voiced concerns about their profit-driven nature, the concentration of power they enjoy, and their lack of transparency.”

The report noted that participants expect AI to influence four main areas: fact-checking and disinformation analysis; content personalisation and automation; text summarisation and generation; and the use of chatbots to conduct preliminary interviews and gauge public sentiment on issues.

It outlined the following six steps towards an AI strategy for news organisations:

· Get informed. This includes reviewing the LSE JournalismAI website, which offers online introductory training, the AI Starter Pack, a Case Study hub and a series of reports on innovation case studies, as well as other available sources; the report gathers such materials in a “Readings and Resources” section.

· Broaden AI literacy. This includes understanding the components of AI that are affecting journalism the most, since AI will impact everyone’s job – not just editorial staff, and not just the ‘tech’ people.

· Assign responsibility. Someone in the organisation should be given responsibility for monitoring developments both in the workplace and more widely, for example by appointing AI innovation and research and development leads, and for keeping a conversation about AI going within the organisation.

· Test, iterate, repeat. Experiment and scale, but always with human oversight and management; do not rush to use AI until the organisation is comfortable with the process, and always review the impact.

· Draw up guidelines. The guidelines can be general or specific, and drafting them inclusively, engaging all stakeholders, is a useful learning process in itself. The organisation should also be prepared to review and change them over time.

· Collaborate and network. Many institutions, such as universities, and intermediaries, such as start-ups, are working in this field, and talking to other news organisations about what they have done is advised. The report also notes that generative AI technologies may present new opportunities for newsroom collaboration, given the high enthusiasm for and accessibility of genAI tools.