Everywhere you turn, artificial intelligence, or AI, is being touted as the thing that will revolutionize work.
Text-to-image generation has captured our imagination, and text-to-video generation has produced some very interesting creations that have gained more than their fair share of social media likes. The same technology also creates opportunities for disinformation, something the 2024 election made abundantly evident.
Some technology companies now expect programmers to be “vibe coders,” describing a project in natural language and letting the machine write the code.
There are even services that let laypeople code. I recently tried my hand at creating an app with a tool called Bubble, and the process proved both painless and time consuming: the coding happened off screen while I waited, but fine-tuning the app still required significant oversight on my part.
In April, Pew Research Center looked at how the American public and AI experts view artificial intelligence tools and what those tools can help produce. The full survey results can be found at https://pewrsr.ch/43E83cK.
Key among researchers’ findings was that AI experts feel more positive and enthusiastic about AI than the public. For example, AI experts surveyed are far more likely than Americans overall to believe AI will have a very or somewhat positive impact on the United States over the next 20 years (56% vs. 17%).
Pew found that U.S. adults as a whole – whose concerns over AI have grown since 2021 – are more inclined than experts to say they’re more concerned than excited (51% vs. 15% among experts).
The study is an interesting showcase of how AI is being marketed: touted in the public eye by the people selling it as “the future,” while the public, your readers, remains just as hesitant as you likely are to adopt these new technologies.
Ultimately, I think this is a good thing as we go back to the initial question posed in the headline: Should your newspaper use AI?
The answer, of course, is that it depends.
At the recent Media and the Law seminar held in Kansas City, one of the panels discussed how AI will help media report the news of tomorrow. Adi Kamdar, a Washington, D.C., attorney on the panel, said AI tools are becoming the norm. “Everyone is working on artificial intelligence, whether we like it or not.”
Ayan Mittra, another panelist and senior managing editor of local news for The Texas Tribune, said his publication has cautiously experimented with AI but that it is important to know AI’s limitations and capabilities.
AI can create content, but it can’t create knowledge, Mittra said, so consider it for uses that focus on efficiency and accessibility.
Amy Kristin Sanders of Penn State University, who moderated the Media and the Law panel, said newsrooms’ use of AI tools is not new.
“For news organizations, a level of human intervention is a fundamental part to have in place,” Sanders said, later adding, “[Use of AI] does not give us license to stray from the good fundamentals of journalism.”
What does that mean for you? Even if you’re not comfortable disrupting your workflow with AI, it is important that you not ignore it, and that you at least familiarize yourself with what it is, for two reasons.
First, as more journalism is created using AI, it is imperative that your publication be able to identify which of its work is produced entirely by humans, which entirely by AI and which is a product of human and AI collaboration.
Pew’s study found a lot of skepticism about industry efforts around responsible AI: 59% of the public and 55% of surveyed experts have not too much or no confidence in U.S. companies to develop and use AI responsibly.
Second, the next generation of your newspaper’s employees will be versed in AI’s use, and you want to embrace that. Even if you don’t think yourself capable of adapting AI tools to your workflow, these new journalists will be thinking about the technology in ways you may not have considered.
Attorney Kamdar recommends putting your news organization’s guidelines for AI use on your website in order to be completely transparent with your audience. If reporters are permitted to use ChatGPT or similar AI tools, he said, they must talk with editors to ensure fact checking takes place.
These tools, Kamdar said, “do not change that you have to rigorously fact check and confirm. Regardless of what tools we use, it’s still coming from the publication and readers aren’t going to separate something coming from the publisher and the AI tool.”
In the meantime, if you are already using AI in your organization, do you have ethics guidelines for staff? If not, Poynter has created a framework to help newsrooms get started; it can be found at bit.ly/43DKII9.
Also, I hope you will consider attending this year’s Annual Convention and Trade Show, or sending a staff member. One of our scheduled speakers is Austin Lewter with the Texas Center for Community Journalism, who will talk about AI’s use in community newspapers.