Your guide to GAI: Seven principles you should know

Guiding principles for work-related use of GAI (generative artificial intelligence) are now in place for staff and managers. If you follow these seven principles, with a focus on safety, data security and critical evaluation, you can explore the many different GAI applications with confidence.

AI has rapidly made its entry into working life, making it possible, for example, to create images like this one. Digital illustration: Nikolai Lander, AU Design

Lots of AU employees have been using generative AI (GAI) tools in their work for quite a while now. Others have dabbled a bit, and still others haven’t yet started exploring the GAI toolbox. But in all cases, the explosion of interest in GAI has led to a demand for a shared framework to guide the use of GAI at AU.

Get GAI right: seven principles

  1. Don’t use GAI for confidential or sensitive data
  2. Remember that you are responsible for the quality of the content you use or share
  3. Credit your use of GAI so that others know how and why you have used it
  4. Be critical when using GAI as a source of information – it can make mistakes
  5. Be sensitive to bias in GAI-generated content – it often reproduces the biases in the data it’s trained on
  6. Only use GAI when standard search engines won’t do – it’s better for the climate
  7. Coordinate purchases of licenses for GAI applications with AU IT and AU Finance – to keep costs down and keep our data safe

Get the full version of the principles at medarbejdere.au.dk/en/gai

And now version 1.0 is here: the university has adopted seven principles for using GAI. These principles are aimed at staff and students, and are intended to help you understand and think through the issues at stake when you or your staff experiment with how GAI applications can contribute to your work performance. The principles have been designed to strike a balance between curiosity and awareness of potential risks, explained Peter Bruun Nielsen, deputy university director for AU IT:

"AU employees should never use GAI for confidential or sensitive data, and the accuracy of GAI-generated content should always be confirmed. What’s more, these apps should never be used for operations that ordinary search engines can do just as well with significantly lower power consumption.” He also emphasised that the principles are not an exhaustive, one-size-fits-all checklist:

“These principles provide some ground rules, but they must be supplemented by common sense and critical judgement. Because GAI technologies are developing quickly and transforming work processes in ways we don’t yet fully understand, we shouldn’t perceive GAI as something to implement – it’s something we need to explore and work with ourselves.”

The seven GAI principles were approved by the administration’s leadership team. They only apply to the use of standard GAI applications in connection with our day-to-day work in a non-research context, where other considerations may need to be taken into account.

New advisory group tasked with further development of principles

The seven GAI principles will now be communicated throughout the organisation, and will provide a point of departure for ongoing discussion in management forums and units across the university. The administration’s leadership team will appoint a working group and an advisory group to support the development of guidelines, rules and principles that will keep pace with developments in GAI-based applications going forward.

New rules on using GAI in exams

GAI also has uses in connection with teaching and exams. AU has just adopted a new set of rules and recommendations for the use of GAI in exams that will apply starting in the autumn semester. The new policy will be communicated to teaching staff and students in August.

All employees have access to Copilot

All AU employees automatically have access to the basic version of Microsoft Copilot (Microsoft’s GAI chatbot). You can sign in with your AU login credentials to access Copilot in the Microsoft Edge browser or at copilot.microsoft.com.

The data you upload/type into Microsoft Copilot will not be saved and will not be used to train the model. Nonetheless, you should never put confidential or sensitive personal information into Copilot or other GAI applications.

Later this year, the administration’s leadership team will announce concrete recommendations for the use of GAI applications at AU on the basis of a risk assessment.