Editor’s note: ChatGPT wrote the headline for this story and generated the image on this page. Background images were created with Adobe Firefly.
A few weeks ago, we asked editors on the ISWNE Hotline to tell us how they’re using AI in the newsroom. A lot of people responded; some are using AI aggressively, while others remain reluctant to pursue its newsroom use.
Of course, some AI tools have been around a long time, including spellcheck software and early grammar-checking software.
For the past year or so, we’ve been experimenting with more advanced AI and its potential to help manage the flow of news on a consistent basis.
We’ve found some things it’s very good at, but it does have some flaws.
Like a lot of small-town newspapers, we face financial and staffing limitations. Even if we could afford more staff, finding and keeping people is increasingly difficult. Plus, we publish five weekly papers, and while each has its own editor, we share resources and help support each other.
So, our experiments with AI have been to find ways to be more efficient with the weekly workload, especially routine duties. In that, AI is proving to be a great tool, and we believe it has a lot of potential for the future at most weekly newsrooms.
Here are some of the ways we’ve been using and experimenting with AI, starting with some simple examples working up toward our more complex and problematic experiments:
Transcribing/Otter
Like many of those who commented on the ISWNE Hotline, we too use Otter and other transcription software (iPhones now have that ability) to get quotes and to recall other comments from public meetings. We record all meetings we attend, as do all our editors and reporters. The problem is sound quality; it is often poor and the transcription becomes muddled. Still, as more local governments set up their own recording devices for audio and video, the quality may improve, especially if we can link our recording into those systems. (More about using audio transcriptions later.)
Simple editing
We get a lot of emails with data in a variety of formats and conditions. We’ve found that AI is excellent for processing this information and is much faster than doing it manually. (Note, we’re currently using ChatGPT with the upgraded paid version because it is more robust. There may be better AI software for newspaper use, but this was a way to start.)
Boots-on-the-ground example: If you’re in the United States, you may receive college/university dean’s/president’s/graduates lists from Merit Pages. They often capitalize the degree names (Bachelor of Arts in Journalism) and include the ZIP code for each student. Depending on the college, that can be over 100 students in our area.
Copy and paste that text into Chat with the instruction to “De-capitalize non-proper nouns and remove ZIP codes,” and it can do in 30 seconds what would take us 10 minutes manually or through the “Find and Replace” function in Microsoft Word and similar software. Saves time and frustration.
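For editors comfortable with a little scripting, cleanup this mechanical doesn’t even require AI. Here’s a minimal Python sketch of the same task; the function name, regex patterns and sample text are our own illustration, not part of any tool mentioned above:

```python
import re

def clean_merit_list(text: str) -> str:
    """Strip ZIP codes and lowercase over-capitalized degree fields
    in a graduates-list blurb (illustrative sketch only)."""
    # Remove 5-digit ZIPs (and ZIP+4 forms like 30549-1234), along with
    # a preceding comma/space so punctuation doesn't double up
    text = re.sub(r",?\s*\b\d{5}(?:-\d{4})?\b", "", text)
    # Lowercase the field of study after "Bachelor/Master of X in ...",
    # leaving the degree name itself capitalized
    text = re.sub(
        r"((?:Bachelor|Master) of \w+ in )([A-Z][a-z]+(?: [A-Z][a-z]+)*)",
        lambda m: m.group(1) + m.group(2).lower(),
        text,
    )
    # Collapse any doubled spaces left behind by the removals
    return re.sub(r" {2,}", " ", text).strip()
```

A script like this is rigid — it only catches the patterns you anticipated — which is why Chat’s flexibility with messy, inconsistent submissions is attractive. But for a feed that arrives in the same format every week, a one-time script is free and instant.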
Another example is that Chat can refine submitted copy and even enhance it. We recently had a story submitted by a school about some awards local students had won. We put it into Chat to clean it up and to our surprise, Chat recognized the award program, linked to the state department of education website and pulled in relevant background about it.
One caveat, however: Chat tends to be too rah-rah promotional and needs command language to force it into the third person and tamp down extraneous language.
Extracting data from flyers
ChatGPT is good at extracting the words from a picture flyer and creating an announcement from it. (We get these because people make them to post to Facebook and other social media.) The command you give Chat is key here. For example, you could upload a flyer for a church open house announcement and give Chat the command to: “Write an announcement from the information in this flyer using Associated Press style and in third person.” Again, this saves time and frustration, especially on deadline. (Note: Chat will also extract data from PDF files and clean it up to remove line breaks, and it will read PowerPoint files. We’ve not tried Excel files, but it will likely read those as well.)
Sports data
In Georgia, results for some sports are uploaded into a central database (especially for track & field and cross-country). That data can be copied and pasted into Chat and cleaned up, removing extraneous information and showing local schools’ results. Chat can also analyze sports data if you have that information available.
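The first pass — pulling local schools’ lines out of a statewide results dump — is another job simple enough to script before (or instead of) handing the data to Chat. A short Python sketch, assuming a plain-text dump with one result per line; the school names are hypothetical placeholders:

```python
# Hypothetical list of schools we cover; swap in your own
LOCAL_SCHOOLS = {"Jefferson", "Jackson County", "East Jackson"}

def local_results(raw: str) -> list[str]:
    """Keep only the result lines that mention one of our schools."""
    return [
        line.strip()
        for line in raw.splitlines()
        if any(school in line for school in LOCAL_SCHOOLS)
    ]
```

Chat still earns its keep on the second step — turning the filtered lines into readable copy or spotting trends across meets — but a filter like this keeps you from pasting hundreds of irrelevant rows into the chat window.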
Headlines
Stuck for a headline? Put the story into Chat and tell it to give you three suggestions.
It does a fairly good job with that. If you don’t like the first draft, you can tell Chat to tweak it or focus on a particular aspect of the story. Good for when you’re tired and on deadline.
Cutlines
AI is proving invaluable in this, especially for the routine school photos submitted to us each week (we have three school systems and a multitude of schools in our county, and they compete to see who can send in the most school news). Often, the photos are of kids doing something, but come with no names. We put those photos into Chat, tell it the school name and the activity, and it will spit out suggested cutlines for each photo.
The most interesting example of this came when a local high school sent its band to London to participate in that city’s New Year’s Day parade. We received a bunch of photos of the kids marching through the streets of London and some of them posing around the city, but no specific names. We put the photos (12-15 of them) into Chat, told it the school’s name and what the event was and asked for cutlines. We got some terrific cutlines back within seconds; Chat even recognized one specific location in a photo and named it in the cutline.
Copy editing
We all make writing mistakes and typos. We’ve found that putting a story, or even an entire page, into Chat and asking it to look for errors and typos helps. It will also make grammar suggestions (in bold) and sometimes will improve the flow of an article. While some may bristle at the idea of AI doing something traditional editors have done, we believe this is the trend for the future, especially as AI gets trained. Plus, there’s really no downside. You can use some of the suggested changes or not.
Art
We’ve not really experimented much with using Chat to create images, but plan to do so for creating editorial cartoons to go with some editorials.
Now for the more advanced uses we’re experimenting with:
Data stories
One of the things most reporters hate is budget, audit and tax stories. They’re often complex and involve a lot of numbers and calculations. This is an area where Chat excels.
There are two ways to go about this: One is to drop a PDF (or other format) of a budget or audit into Chat and tell it to analyze the data and give you a summary of the major datapoints and from that, you can write a story. The second way is to drop the data into Chat and tell it to write an AP-style story and see what it gives you back.
We did this recently with an old city budget (64-page PDF) we had on hand to see what would happen. It gave us a story, but we didn’t think it had the right focus. We then told Chat to rewrite the story and “add more details to the above story including millage rate data and additional percentage increases or decreases in data.” Chat then rewrote the story with more of a focus on the millage rate and additional percentages of how the budget had changed from the year before. It wasn’t a bad story; at the very least, it would give someone a starting point to write from.
In a related experiment, we dropped a PowerPoint audit summary into Chat from one of our local towns and told it to write a story. It wrote a great summary of the audit’s highlights and did so in a few seconds. Chat really does do data well and fast if you give it the right command.
Crime processing
Crime processing takes a lot of time for us. Our local sheriff’s department makes approximately 100,000 reports a year, and that’s not including the various city police departments and the arrest log. It’s a lot to sort through on a weekly basis. We’ve been experimenting with using ChatGPT to handle some of the more routine aspects of crime reports, but we’re still just in the experimenting phase for that.
Meeting stories
This is where Chat begins to falter. Some of that is in the commands given to Chat; the more specific and detailed, the better the outcome will be.
There are a couple of ways we’ve experimented with this idea. One has been to feed the minutes from a meeting into Chat and tell it to write an AP-style third-person story. Chat can do that, but the quality depends on the complexity of the meeting.
In a multi-topic meeting, if you tell Chat to lead with a specific issue, that helps it focus. But minutes are often sparse and filter out controversy, so it’s not the same thing as a reporter covering a public meeting. Still, if like us you have multiple meetings on the same night and have to rely on minutes to backfill, Chat could be a way to quickly get a draft done which can then be refined. (An interesting experiment we’ve not done would be to feed a year’s worth of a government’s minutes into Chat and ask it to look for trends.)
Beyond minutes, we wanted to see if Chat could take a transcript from a recording and write a news story based on that. This is where AI struggles. For one thing, as mentioned above, transcripts are often muddled because the initial recording wasn’t very clear. Chat can figure out some of that by context, but not everything.
Recently, we’ve tried this with two very different meetings.
The first was a single-issue public hearing by a small-town council about a controversial new tax law. We covered the meeting and wrote a story. Later, we put a transcribed recording of the meeting into Chat, told it the name of the city and the topic being discussed, then told it to write a story. We wanted to see how close the Chat version would be to the version we had written.
The short answer is, better than we anticipated. Chat did identify the topic as having been confusing, which we also mentioned in our story’s lead. And it was able to filter out some of the extraneous comments from the audience sitting nearby and pull information from the actual discussion.
Still, there were several aspects to the story that Chat didn’t know but we did, like how the discussion fit in with similar discussions being held by other local governments and the fact that for that community, the financial impact would be very small anyway. In other words, Chat couldn’t give context to the story that we could.
In addition, Chat couldn’t identify the names of those speaking, although it was able to distinguish by context if a speaker was a public official or a citizen.
The second experiment was more complex. We covered a county government meeting that had a lengthy agenda including many zoning issues. We wrote two stories from the meeting, one about the zoning and another about other issues that had been voted on by the board.
Later, we put the lengthy agenda (274-page PDF that had supporting documents) that had been supplied by the county and a transcript of our recording of the meeting into Chat and told it to write two stories, one about the zoning issues and another about the rest of the agenda. The idea was to see how closely Chat matched our “human-written” stories.
The first version of the Chat stories was boilerplate and read like minutes with no real lead and little detail.
We then told Chat to rewrite the stories and pick out a main lead and to expand on that topic with more details. This time, Chat did a pretty good job with the general, non-zoning story and had a lead. But it wasn’t the lead we had used and it didn’t summarize the other issues very well.
Likewise, for the rewritten zoning story, Chat did have a lead in the second version, but it was the wrong issue to focus on. We then went back a third time, told Chat to rewrite the zoning story and gave it a specific lead we outlined. That time, Chat did write a good lead, but it also created a major problem: it made up a couple of quotes and inserted them into the story. It just made them up. While Chat had access to the transcript of the meeting to query, those quotes were not in it.
We’ve seen this with Chat a few other times as well, so for future commands, we will include verbiage for it to not create any quotes. (We suspect this tendency by Chat to make up quotes is related to its use to write promotional materials for social media. It’s a bad tendency for newspapers and perhaps it will “unlearn” that habit as we give it more commands.)
Summary
Overall, Chat (and maybe other AI software) can be a huge help today with routine newsroom processing duties, and it can help copy edit, write headlines and write cutlines. It can also help analyze large data sets, such as budgets and audits, and synthesize them into a summary or perhaps even a full story if given a good command or series of tweak commands.
As for news writing from public meetings, Chat has a lot of promise, but is not ready for prime time if the only data it is fed is a transcript from a meeting. It can do OK with minutes from meetings if there is no other choice.
Another hurdle will be to train AI to recognize what is important in a meeting and to create a lead. That’s possible by delineating some criteria for it to follow, or simply picking a lead based on the agenda and putting that into the command.
As AI learns from these exercises, we expect it will get better. One area we’re exploring is to see if we can make our entire digital database of past stories in BLOX available for Chat to query. That knowledge base would be very helpful to have as a real-time resource that AI can quickly scan to find relevant data and give more context to current stories.
Based on our experiments and use, we believe that in the coming years, AI will become as big a revolution in our newsrooms as the transition from hot to cold type was in the 1960s and the transition to computer-digital layout was in the 1990s.
Mike Buffington is co-publisher of Mainstreet Newspapers and was the 2004 president of the National Newspaper Association. Alex Buffington is news editor of the group’s flagship newspaper, The Jackson Herald. Email Mike at mike@mainstreetnews.com. This column was originally published in the March issue of the International Society of Weekly Newspaper Editors’ newsletter and is reprinted here with permission.