Acceptable uses of generative AI services at IU

Since 2020, investment in and development of generative artificial intelligence (AI) services have grown rapidly. Generative AI is a type of artificial intelligence system that generates new text, images, or other media in response to prompts. Notable generative AI systems include ChatGPT, Microsoft Copilot, and Google Bard.


Generative AI has potential applications across a wide range of industries, including art, writing, and software development. However, there are also concerns about the potential misuse of these tools and of any data shared with the services. When you provide information to these tools, such as queries, student essays, grant proposals, source code, or datasets, treat it as equivalent to posting that information on a public website.

Indiana University encourages its affiliates to experiment with using these generative AI services, as long as no institutional data is submitted to them without approval.

At IU, Microsoft Copilot is available for use by IU faculty and staff, and it is the recommended way to use generative AI within the IU environment. As part of the university's enterprise agreement with Microsoft, Copilot is approved to interact with data classified up to and including University-Internal data. To use Copilot, you must be logged in with your Microsoft 365 at IU account (your email address and your IU passphrase). For information about browser and app compatibility, see About Microsoft Copilot at IU.

To date, no other generative AI tool has been approved for data beyond the Public classification, and none has been through the Software and Services Selection Process (SSSP). Before any institutional data is shared with them, these services must go through review to ensure that the necessary contracts and safeguards are in place to protect the data submitted, and that the algorithms in use are ethical, transparent, and beneficial to the IU community.

Types of institutional data that should not be submitted to public versions of generative AI tools, even when anonymized, include:

  • Data classified as University-Internal or higher (for examples, visit the Data Classification Matrix)
  • Any data that may be considered student, faculty, or staff intellectual property, unless the individual submitting that intellectual property created it

Specific examples that are not appropriate for the public versions of generative AI tools include:

  • Sharing names and information about a real student, employee, research participant, or patient
  • Asking an AI service to summarize and grade a student paper or assignment
  • Sharing employee-related data such as performance or benefit information for communication drafting or analysis
  • Asking an AI service to generate code for IU systems protecting institutional data or sharing IU source code for editing
  • Sharing grant proposals still under review

Acceptable uses

With these precautions in mind, there are numerous ways to use generative AI tools without submitting university data or intellectual property. Submitting general queries that contain no institutional data is a good way to engage with these products.

Students should use generative AI in ways that align with university academic integrity policies and should communicate with their instructors before using generative AI in their coursework. Schools and departments may elect to restrict the use of generative AI further.

From a data management perspective, examples of acceptable uses of generative AI include:

  • Syllabus and lesson planning: Instructors can use generative AI to help outline course syllabi and lesson plans, getting suggestions for learning objectives, teaching strategies, and assessment methods. Course materials that the instructor has authored (such as course notes) may be submitted by the instructor.
  • Correspondence when no student or employee information is provided: Students, faculty, or staff may use fake information (such as an invented name for the recipient of an email message) to generate drafts of correspondence using AI tools, as long as they are using general queries and do not include institutional data.
  • Professional development and training presentations: Faculty and staff can use AI to draft materials for potential professional development opportunities, including workshops, conferences, and online courses related to their field.
  • Event planning: AI can assist in drafting event plans, including suggesting themes, activities, timelines, and checklists.
  • Reviewing publicly accessible content: AI can help you draft a review, analyze publicly accessible content (for example, proposals, papers, and articles) to aid in drafting summaries, or pull together ideas.

Even if you use generative AI tools for activities that do not share personal or institutional data, you should still check the tool's output for accuracy. These tools have been known to produce inaccurate content (sometimes called "hallucinations"), so verify any factual information generated by an AI tool, and make sure to cite the tool as you would any other source.

Learn more and get help

If you have any privacy-related concerns about generative AI tools, or questions about the types of data that can be shared with them, email the university's privacy office.

This is document biit in the Knowledge Base.
Last modified on 2024-02-05 15:26:52.