This page of the guide was written by IFIS Publishing's Katy Askew and added in August 2025.
AI tools can be helpful when conducting literature searches as part of a comprehensive literature search strategy. But remember: never accept an AI output at face value.
Below, we have outlined some key considerations when using AI tools for literature searching.
Important! The technology behind generative AI tools, and the research community's understanding of their responsible use in academic research, are constantly evolving. Always refer to the plagiarism and academic integrity policies at your institution for specific requirements.
Consider whether AI use is right for each stage of your literature searching strategy. Generative AI can be great for getting a quick overview, generating ideas, or brainstorming keywords. It can be useful at the beginning of your literature review process, helping you define your search question and gain a broad view of the topic and the existing literature.
However, for the more in-depth literature searching stage, or when finding specific evidence, a curated database is indispensable. Use AI if it helps, but always follow up with trustworthy sources from the library.
This table outlines the key elements and differences between generative AI tools and curated databases. By understanding these differences, you can ensure that you are using the right resource for the task.
|  | Generative AI (e.g. ChatGPT / Perplexity / Claude / etc.) | Curated Database (e.g. FSTA / NutriHealth) |
|---|---|---|
| Scope of information |  |  |
| Quality control |  |  |
| Source transparency |  |  |
| Search method |  |  |
| Comprehensive coverage? |  |  |
It is important to check the plagiarism and academic integrity policies at your institution for specific requirements. It is generally recommended that you are fully transparent about the AI tools you have used.
Using Generative AI without acknowledgement is usually treated as plagiarism, the same as copying someone else’s work. If your instructor allows AI for certain parts of the work (like brainstorming or first drafts), you must still disclose that use and cite any content from it. Presenting AI-generated text as your own work (or letting it fabricate sources for you) violates academic integrity. Universities and libraries are emphasising that students remain responsible for the work they submit.
If you do use a Generative AI tool, get permission if required, use it only as allowed, and always give proper credit. If you’re ever unsure, ask your instructor or a librarian for guidance. It’s better to be safe and transparent than risk your academic reputation.
When citing an answer from an AI tool, cite it in the same way you would cite a personal communication or an unarchived source, unless your style guide provides specific instructions for generative AI.
Always check for updated guidelines, as formal citation rules for AI are new and evolving. And remember, if you quote text that the AI wrote, you should put it in quotation marks and clarify in your paper that it came from an AI (just as you would quote and attribute any author).
BEST PRACTICE RECOMMENDATION: Treat ChatGPT as a source that needs acknowledgement – never just paste its output into your essay without citation.
AI tools can produce information that is incomplete, biased, or even incorrect. Consider these 5 questions to critically evaluate any answer you get from an AI tool before relying on it in your academic work.
Follow the steps in this flowchart for a simple, concrete method you can use to assess the reliability of facts and citations given by generative AI tools such as ChatGPT, Perplexity, Copilot and many more.