Your guide to using GAI responsibly

Here are seven principles you need to keep in mind when you use GAI in your work.

These are general principles that will be further developed.


Definition of GAI

Generative artificial intelligence (GAI) tools use vast amounts of data and machine learning techniques to produce content (text, image, sound or video) based on prompts (questions or commands) written by users.

1. Don’t use GAI for confidential or sensitive data

When you use GAI applications, it’s important to keep data security and GDPR rules in mind.

This means that you may not use GAI for anything involving trade secrets, confidential or sensitive data, or copyrighted material.

However, you are allowed to use GAI for data classified as ‘public’ or ‘internal’.

2. Remember that you are responsible for the quality of the content you use or share

When you use GAI to generate a text or an image, you are responsible for ensuring the accuracy and quality of the content.

You should always make sure that any GAI-generated content you use is correct before using or sharing it.

3. Credit your use of GAI so that others know how and why you have used it

AU’s credibility in relation to its stakeholders depends on openness and transparency about our use of GAI.

When GAI is used in automated guides and FAQ chatbots, it must be clearly credited so that the user knows that answers are being generated by GAI without prior human quality assurance. All GAI-generated responses must be accompanied by a disclaimer stating that the response is auto-generated and may contain mistakes.

Generally speaking, you should always consider whether crediting your use of GAI is relevant when you use GAI to generate a text, an image, a video or another product.

4. Be critical of GAI as a source - it can make mistakes

Large language models generate content based on their training data without assessing or verifying its accuracy or citing specific sources. This means that GAI is not a source that you can cite as an authority. GAI makes mistakes, and GAI may ‘hallucinate’ sources or facts that aren’t correct just to give you an answer.

This means you should always verify the statements and sources you get from GAI applications – and you should never use GAI as a reference work or search engine.

5. Be aware of bias in GAI-generated content - it will often reproduce biases

GAI-generated content can inadvertently reinforce existing biases and power imbalances, because the model selects the most likely outcome based on data that encodes majority perceptions, for example in questions regarding gender, race or other demographic categories.

This means you should always actively assess GAI-generated content to ensure that it does not reproduce biases.

6. Keep in mind that GAI uses a lot of power - using alternatives is better for the climate

GAI applications use significantly more power than other online search tools and apps.

Only use GAI when you can’t find answers or perform a task using an ordinary search engine, which uses less power.

7. GAI purchases must be handled centrally - for financial and security reasons

In addition to access to all free GAI applications, all students and staff have access to a version of Microsoft Copilot that is similar to the free version of ChatGPT.

No university-wide guidelines for which GAI applications staff and students are allowed to use have been adopted, and it has not been decided whether such guidelines will be put in place. If you have a particular need for a specific GAI application, you can discuss your needs with your manager.

However, any purchases of systems and licenses must be carried out in collaboration between the individual unit and AU IT and AU Finance, in order to ensure that such purchases are financially, environmentally and technically sound. 


AU’s position on GAI

  • AU must stay abreast of developments within GAI technologies with an eye to AU’s potential to improve the quality of our work and streamline our work processes.
  • At the same time, AU must act responsibly and transparently – and curiosity about the possibilities of GAI must go hand-in-hand with an awareness of potential risks and critical judgement.
  • GAI technologies are developing rapidly, and are transforming work processes in ways the university does not yet fully grasp. This means that GAI should not be perceived as something to be implemented at AU, but rather as something we ourselves must explore and work with, as staff and managers.

Organisation

The seven guidelines for using GAI were drafted by a working group consisting of:

  • Peter Bruun Nielsen, deputy director, AU IT
  • Trine Graae Lundorf, head of Events and Communication Support
  • Anne Bækby Johansen, head of Aarhus BSS Administrative Centre
  • Jakob Rathlev, deputy director, AU Research

Roskilde University’s principles served as a source of inspiration for these principles.

Organisation of work on GAI going forward

A working group will be (re)appointed and tasked with formulating the mandate for a new GAI advisory group – including proposals for participants – which will be discussed and approved by the administration’s leadership team.

The tasks of the GAI advisory group will be to ensure knowledge-sharing and the exchange of experiences across the organisation with a view to learning and competency development, and to ensure that the structural policy framework, including rules, instructions and guidelines, can be adapted as needed.