Information for researchers – guidance on the use of generative AI in research is available

Generative AI (artificial intelligence powered by large language models) can manipulate and generate text and media in response to arbitrary instructions. These new capabilities present researchers with both opportunities and risks.

In recognition of these challenges, we have developed a guidance note outlining the University’s position on the use of generative AI in research and the obligations associated with using this technology.

Areas covered include the use of generative AI for research publications, theses, peer reviews, and grant applications.

Next year, researchers will be invited to workshops offered by various faculties and the Centre for Applied AI to learn more about the role of generative AI in research. Details about the workshops will be shared on the staff intranet. A hands-on introduction to large language models like Bing Chat and ChatGPT, featuring Dr Brian Ballsun-Stanton, Solutions Architect (Digital Humanities) from the Faculty of Arts, is now available to view.

Resources and workshops are listed on the Research Data Management SharePoint, where you can register for updates about generative AI training.

For further information about the responsible use of generative AI in research, contact your faculty research integrity advisor.

View the guidance note here.
