We wish to alert our readers to a continuing problem: generative artificial intelligence (AI) tools can produce convincing but fabricated or inaccurate references [1, 2]. Citation inaccuracies are not new in scholarly publishing [3-6], but generative AI may increase their frequency by producing plausible-looking citations that can evade routine checks unless they are verified against the original source or an authoritative database record. Early generative AI tools were particularly prone to this fabrication; more modern tools are less likely to fabricate references outright because they ground their outputs in web searches [2]. The problem persists, however, because even these modern tools are misled by imperfect records. For example, Google Scholar creates records with a [citation] stub for references it cannot match to actual documents, and the works those stubs describe may not exist. Similarly, retracted papers continue to be cited, and papers continue to be cited for claims they do not support. Most modern tools cannot yet ground their output while accounting for these weaknesses, and as a result they may amplify existing structural vulnerabilities in scholarly communication [2]. We will, therefore, introduce a declaration requiring authors to confirm that all references in a manuscript exist, that their bibliographic details are accurate, and that they support the claims being made. Where concerns arise, editors may request verification (e.g. DOI links), and corrections will be required prior to acceptance.

Jamie Brown: Writing—original draft. John Marsden: Writing—original draft.

Not required.