Here are seven principles you need to keep in mind when you use GAI in your work.
These are general principles that will be further developed.
Generative artificial intelligence (GAI) tools use vast amounts of data and machine learning techniques to produce content (text, image, sound or video) based on prompts (questions or commands) written by users.
When you use GAI applications, it’s important to keep data security and GDPR rules in mind.
This means that you may not use GAI for anything involving trade secrets, confidential or sensitive data, or copyrighted material.
When you use GAI to generate a text or an image, you are responsible for ensuring the accuracy and quality of the content.
You should always make sure that any GAI-generated content you use is correct before using or sharing it.
AU’s credibility in relation to its stakeholders depends on openness and transparency about our use of GAI.
When GAI is used in automated guides and FAQ chatbots, it must be clearly credited so that the user knows that answers are being generated by GAI without prior human quality assurance. All GAI-generated responses must be accompanied by a disclaimer stating that the response is auto-generated and may contain mistakes.
Generally speaking, you should always consider whether crediting your use of GAI is relevant when you use GAI to generate a text, an image, a video or another product.
Large language models generate content based on their training data without assessing or verifying its accuracy or citing specific sources. This means that GAI is not a source that you can cite as an authority. GAI makes mistakes, and GAI may ‘hallucinate’ sources or facts that aren’t correct just to give you an answer.
This means you should always verify the statements and sources you get from GAI applications – and you should never use GAI as a reference work or search engine.
GAI-generated content can inadvertently reinforce existing biases and power imbalances, because it selects the most likely outcome based on data that encodes majority perceptions, for example in questions regarding gender, race or other demographic categories.
This means you should always actively assess GAI-generated content to ensure that it does not reproduce biases.
GAI applications use significantly more power than other online search tools and apps.
Only use GAI when you can’t use an ordinary search engine, which uses less power, to find answers or perform a task.
In addition to access to all free GAI applications, all students and staff have access to the version of Microsoft Copilot that is similar to the free version of ChatGPT.
No university-wide guidelines for which GAI applications staff and students are allowed to use have been adopted, and it has not been decided whether such guidelines will be put in place. If you have a particular need for a specific GAI application, you can discuss your needs with your manager.
However, any purchases of systems and licenses must be carried out in collaboration between the individual unit and AU IT and AU Finance, in order to ensure that such purchases are financially, environmentally and technically sound.
The seven guidelines for using GAI were drafted by a working group consisting of:
One source of inspiration for the principles was Roskilde University’s principles.
A working group will be (re)appointed, which will be tasked with formulating the mandate for a new GAI advisory group – including proposals for participants – which will be discussed and approved by the administration’s leadership team.
The tasks of the GAI advisory group will be to ensure knowledge-sharing and the exchange of experience across the organisation with a view to learning and competency development, and to ensure that the structural policy framework, including rules, instructions and guidelines, can be adapted as needed.