Science Journals’ Editor-in-Chief on Responsible Use of AI in Research
In Science’s first editorial of 2026, editor-in-chief Holden Thorp reflects on thoughtful AI use in scientific publishing
The first Editorial of the year at Science always gives the editor-in-chief an opportunity to reflect on notable developments for the Science journals. In this Editorial, Holden Thorp focuses on AI, discussing how it “will allow the scientific community to do more if it picks the right ways to use it.”
He revisits the journals’ policies and approaches related to AI.
The journals do use select AI tools. “Science’s most recent policies allow the use of large language models for certain processes without any disclosure, such as editing the text in research papers to improve clarity and readability or assisting in the gathering of references,” he writes.
“However, the use of AI beyond that—for example, in drafting manuscript text—must be declared. And the use of AI to create figures is not allowed.”
“Over the past year, for example, Science has collaborated with DataSeer to evaluate adherence to its policy mandating the sharing of underlying data and code for all published research articles. The initial results are encouraging in that of 2,680 Science papers published between 2021 and 2024, 69 percent shared data.”
DataSeer is a tool used by science journals to enhance research reproducibility. Its natural language processing technology scans submitted papers and generates a prefilled reproducibility checklist. Authors then review, confirm, and revise the entries as needed, helping ensure that studies meet transparency and reproducibility standards.
But Thorp notes that “although AI is helping Science catch errors that can be corrected or elements that are missing from a paper […] its use and the evaluation of the output require more human effort, not less.”
This is because the reports these AI tools generate must still be assessed by people.
“As a small family of journals that can put more human effort into each paper,” Thorp writes, “the Science journals are less susceptible—and contribute less—to the accumulation of ‘AI slop’ in the literature, but no system, human or artificial, can catch everything. Potential degradation of the literature by technology reinforces the value of a record maintained with human scientific experience and expertise.”
Thorp notes that 15 years ago, some predicted that massive open online courses would threaten universities, but instead they became a key part of education, helping institutions grow. Similarly, moving journals online expanded scholarly publishing rather than shrinking it.
Like other tools, when used wisely, AI can enhance scientific work. “The community needs to be careful and not be swept up by the hype surrounding every AI product,” Thorp concludes.
