
Tip sheet: Harnessing ChatGPT for good journalistic use

Photo by Andrew Neel via Pexels

In mid-July, AHCJ hosted a webinar with Alex Mahadevan, the director of MediaWise at the Poynter Institute, on how journalists can put ChatGPT to good use. We had a great turnout, with more than 80 participants watching live. Here, I’ll cover the highlights from Alex’s presentation and from another webinar on ChatGPT hosted by the Online News Association. Links to both recordings are in the resources section below. 

For anyone unfamiliar with ChatGPT, it’s essentially a chatbot (an artificial intelligence program that simulates human conversation) that learns as it goes. A user can type in a question or command (e.g., “What is a Thai recipe I can make at home?”), and the program draws on the web pages, articles, research papers and other text it was trained on to produce what it calculates is the best answer. The more specific the request, the more likely it is to deliver the type of information the user is seeking.

How journalists and newsrooms use chatbots varies, partly because of news stories highlighting the tools’ errors. The overall view is negative, and some journalists are hesitant to admit using these programs, Mahadevan said. Still, the tools can be useful for tasks such as generating Freedom of Information Act requests, he added.

In a separate webinar hosted by the Online News Association, Ernest Kung, AI product manager for the Associated Press, noted that so-called generative AI (tools like ChatGPT), as the technology exists currently, “has an extraordinary capacity to be wrong. And it can be confidently wrong. That means we need to be very, very careful in how we approach [it].”

Kung and other panelists suggested one of the best uses of generative AI tools is for data-heavy tasks such as compiling lists of upcoming town council meetings or providing weather updates. Kung discussed five projects using generative AI, supported by the Knight Foundation, in which AP is helping local newsrooms with small staffs:

  • Michigan Radio, a public radio station in Ann Arbor, is expanding a computer application that creates transcripts of city council meetings to include summaries and keyword identification.
  • WFMZ-TV in Allentown, Pa., is creating a program to sort tips and pitches from the public to determine if they’re newsworthy, and, if so, to add them to the station’s coverage plan.
  • El Vocero de Puerto Rico, a Spanish-language newspaper, is creating a program to quickly translate National Weather Service alerts from English to Spanish to notify constituents, which is critical during hurricane season.
  • The Brainerd Dispatch in Minnesota is building a program to automatically write police blotter items.
  • KSAT-TV in San Antonio is building a system to automatically summarize transcripts of recorded videos shot by the station’s photographers and create an article’s initial framework. 

For AI’s current iteration, Mahadevan said, the most helpful, practical and ethical use right now is for data work. 

“It can be a really good starting point for crafting ‘boring’ copy, automating the ‘boring’ stuff you need to work on,” Mahadevan said, using preparing text for a grant application as one example. “It can summarize really complicated research papers, articles, your own work, and can help put things into most of the time what it thinks is simple English for your readers.”

Be cautious

Mahadevan cited a number of important caveats to using AI. For one, the free version of ChatGPT was trained on data that extends only into late 2021, so searches for more recent information may yield inaccurate results. The paid version covers more recent data.

ChatGPT also can “hallucinate” information, inventing facts that sound plausible or attributing fabricated quotes to real people. It can “get really hairy if you are writing about somebody and it hallucinates a fact about that person that is incorrect,” Mahadevan said. Be extremely cautious about using ChatGPT for anything that is public-facing.

“There is a good amount of fact-checking that you need to do for anything public-facing that you use ChatGPT for,” he said, adding that the tool is bad at creative and journalistic writing.

One way to discourage the program from fabricating information is to incorporate the phrase “according to” in your prompt. A recent study found that “according to” prompts successfully directed language models like ChatGPT to ground their responses in previously observed text, according to a story in Johns Hopkins’ HUB magazine. Instead of generating false answers, the models were more likely to directly quote a source, just like journalists.
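
The grounding trick itself is just a matter of phrasing. As a rough illustration (the source name here is a hypothetical stand-in, not one from the study), the cue can be prepended to any question before it is sent to the model:

```python
# Illustrative sketch of the "according to" prompting technique described
# above. The default source is a hypothetical example, not a recommendation.
def grounded_prompt(question: str, source: str = "the CDC") -> str:
    """Prefix a question with an 'according to' cue naming a trusted source."""
    return f"According to {source}, {question}"

prompt = grounded_prompt("what are the current flu vaccination guidelines?")
print(prompt)
# The cue nudges the model toward quoting the named source rather than inventing one.
```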

Another caveat from Mahadevan: Avoid inputting your personal information into ChatGPT, as it’s unclear how or where that could wind up. 

Tips for ChatGPT use

Summarizing: You can use ChatGPT to summarize something you wrote or need to read and digest, or to sum up your interview notes to get you started on a story. Remember that the program is a learning system. A writer in one of my Facebook writers’ groups mentioned that a friend of hers asked ChatGPT to review something they wrote. Later, when doing a separate search, ChatGPT pulled information from that person’s story draft, which raises the question of whether the program would return that same information to another user, sharing that writer’s intellectual property. Mahadevan said he has not heard about that happening, but it can’t be ruled out.

Saved searches: Each time you interact with the chatbot, it remembers what you asked earlier in the conversation. You can revisit a topic and add more information to generate a more specific answer. It can be an iterative process to arrive at the exact type of information you want.

Preparation help: If you are asked to give a talk on a particular topic, ChatGPT could give you a rundown of what to cover, including possible questions for a Q&A session at the end. This could be helpful in situations where an editor asks you to speak with the public or attend events. When writing a contract, Mahadevan asked ChatGPT for a gender and inclusion analysis and received a series of bullet points on what could be covered.

Generating FOIA requests: This is tedious work, especially for reporters who request slightly differing pieces of information or documents from separate departments. Mahadevan conducted a search for health code violations collected by the state of Florida, and received enough information in seconds to serve as a starting point to edit and clean up. The tool could also be used to generate templates for requests.
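
A fill-in-the-blanks template is also something you can keep on hand yourself. The sketch below is a hypothetical illustration of such a template; the agency, records description and wording are placeholders, not legal boilerplate:

```python
# Hypothetical FOIA request template. All fields (agency, records, dates,
# contact details) are illustrative placeholders, not legal language.
from string import Template

FOIA_TEMPLATE = Template("""\
To: $agency
Re: Freedom of Information Act Request

Dear Records Officer,

Under the Freedom of Information Act, I request copies of $records
for the period $start through $end. Please notify me at $email before
processing if fees will exceed $$25.

Sincerely,
$reporter""")

letter = FOIA_TEMPLATE.substitute(
    agency="Florida Department of Health",
    records="health code violation records",
    start="Jan. 1, 2022",
    end="Dec. 31, 2022",
    email="reporter@example.com",
    reporter="A. Reporter",
)
print(letter)
```

Swapping in a different agency or records description takes seconds, which is the same time savings the chatbot approach offers.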

Coding: ChatGPT can offer a way to learn to code, Mahadevan said. Say you’re writing a story about hospital administrator salaries in your state, and the data you get back from a FOIA request is riddled with misspelled names and street addresses. Instead of combing through everything by hand or asking a data journalist in the newsroom for help, you can ask ChatGPT to write code in the programming language R to clean up a column of misspelled names. Within a few seconds, the tool can spit out usable code along with an explanation of how it works.
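
The kind of cleanup code the chatbot hands back can be approximated with a few lines of standard-library Python (Mahadevan’s example used R; this is a hypothetical equivalent, and the names below are made up). It snaps each misspelled entry to the closest match in a known-good list:

```python
# Illustrative sketch of the data-cleanup task described above: fuzzy-match
# misspelled names against a known-good roster. The roster and the
# misspellings are invented for the example.
import difflib

known_names = ["Maria Gonzalez", "John Smith", "Priya Patel"]

def clean_name(raw: str, candidates=known_names, cutoff: float = 0.8) -> str:
    """Return the closest known spelling, or the raw value if nothing is close."""
    matches = difflib.get_close_matches(raw, candidates, n=1, cutoff=cutoff)
    return matches[0] if matches else raw

# A column of FOIA-response names with typos; unmatched entries pass through.
column = ["Maria Gonzales", "Jon Smith", "Priya Patel", "Unknown Person"]
cleaned = [clean_name(n) for n in column]
print(cleaned)
```

The `cutoff` threshold controls how aggressive the matching is; set it too low and distinct people get merged, so spot-check the output just as you would ChatGPT’s.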

Interview preparation: ChatGPT could be used to help generate a list of interview questions about the subject you are covering. 

Generating headlines: You can plug a headline or deck into ChatGPT and ask it to generate an SEO (search engine optimization) headline or snippet to generate more web traffic and page views. 

Generating slides: ChatGPT could be used to generate slides for a presentation or talk based on an article or other background fed into it. 

Generating images: Another generative AI program called DALL-E can be used to generate images. You could ask for an image of a doctor examining an X-ray, for example, and the program will generate one for you based on the images it was trained on. Note that in its current iteration, some of the images may look a little wonky (in examples Mahadevan showed, one doctor’s hands looked too big for the person’s body), and you may still have to be concerned about copyright issues. If you use such images in a public-facing manner, be sure to disclose that they are AI-generated.