How can I find research sources generated by a chatbot?
Last updated: May 01, 2024

Artificial intelligence (AI) tools like ChatGPT and Google Gemini often fabricate citations to research sources that don't actually exist, sometimes referred to as "ghost citations."

Why does this happen? Chatbots like these are designed to generate realistic-sounding language: they are not search engines, and they are not currently capable of reliably producing research-driven writing.

Sources referenced by chatbots may appear plausible at first glance. For instance, they may include the names of real scholarly journals or list authors who have some connection to the topic. The bot may even provide a reasonable-sounding summary! But digging just a little deeper often reveals problems like these:

  • The title of the source is real, but it was written by a different author and published in a different journal.
  • No journal by that title exists.
  • The journal exists, but has never published an article by the listed author.
  • The volume of the journal cited exists, but does not include the page numbers from the citation. (E.g., the chatbot's citation says the article appears on pages 229-41 of a certain issue, but that issue ends at page 215.)

If you're using a chatbot to assist you with research, always access and read the actual sources you intend to cite. (See our guide to finding materials from citations.) Remember that there is a strong likelihood that citations from an AI chatbot are fake, and you could face serious consequences for including them in a research assignment.

For help locating sources on a research topic, please don't hesitate to visit the Reference Desk or get help by phone, email, text, or 24/7 chat. (Our chat is always staffed by real people, not bots!)

For an example of this problem and its consequences, see: