Policy on Use of Artificial Intelligence (AI) in Assessments and Deliverables

Contents

  1. Principles of AI use
  2. Classification levels of allowed use of AI in assessments and deliverables
    1. Assessments
    2. Deliverables
    3. Overview
    4. AI-1: Disallowed
    5. AI-2: Restricted
    6. AI-3: Documented
    7. AI-4: Unregulated
  3. Safe and compliant use of AI
  4. References
  5. Appendix

1. Principles of AI use

Generative AI (gAI) refers to computer systems that produce content in response to natural language queries (e.g., ChatGPT, Bard, Bing) and is now widely available to generate text, images, and other media. AI is likely to transform many fields, and becoming familiar with its strengths and limitations will be as important for health care providers as learning to search the internet. To employ AI safely and effectively in health care education and practice, students first require proficiency in the subject area, must accept responsibility for verifying the accuracy of AI output, and should follow processes for documenting AI use when appropriate.

A. Proficiency

Learning is not simply memorization of facts—it is building flexible knowledge structures that can be called upon to solve problems and evaluate possible solutions. AI systems represent one source of possible solutions, but to evaluate the value of an AI’s proposed solution, it is necessary to have adequate proficiency in that domain. (See Figure 1.)

B. Verification

Students must take full responsibility for AI-generated materials as if they had produced them themselves. Facts must be true, and assertions must follow from those facts. Generative AI is well-known to output incorrect, misleading, or entirely fabricated information. This limitation is especially important in health care and health professions education, where knowledge forms the basis for decisions that can impact patient health.

C. Documentation

All ideas that are not originally one’s own have a source, and that source generally must be attributed. (See Classification Levels of Allowed Use.) Generative AI may invent sources. Documentation of AI use is always a best practice and may be required. When documentation is required, students are obligated to follow standard practices for documentation. (See Appendix.)

2. Classification levels of allowed use of AI in assessments and deliverables

The following classification scheme will be used for all course assessments and deliverables required for course completion:

Assessments

Demonstrations of knowledge or skill, whether proctored or unproctored, such as summative examinations, formative examinations, objective structured clinical examinations, quizzes, and the like.

Deliverables

Written, oral, or audiovisual assignments or presentations such as reflection assignments, journal club presentations, Discovery Project manuscripts, etc.

Instructors and preceptors may provide a blanket classification for all assessments or deliverables in a learning experience or may classify individual assessments or deliverables separately.

If an assessment or deliverable does not have a classification provided, it is assumed to be classified AI-1. By submitting an assessment or deliverable for evaluation:

  • Students assert that they have met all specific requirements of the assigned work, in particular requirements for transparency and documentation of process, or have explained any instances where this was not possible.
  • When use of AI is allowed, students assert that the submitted work accurately reflects the facts and that they have verified those facts, especially any that originate from generative AI resources.
  • When use of AI is allowed and documentation is required, students assert that all sources that go beyond common knowledge are suitably documented. Common knowledge is information a knowledgeable reader can accept without requiring confirmation from a separate source.

Overview

  • AI-1: Disallowed. Any use is an academic integrity violation. Example: summative assessment.
  • AI-2: Restricted. Restrictions apply to the types of AI resources allowed or the aspects of the assessment on which AI may be used, and documentation is required. Example: journal club assignment.
  • AI-3: Documented. No restrictions on AI use, but all use must be documented. Example: background section of an assignment.
  • AI-4: Unregulated. No restrictions on use and no documentation required. Example: email communication.

AI-1: Disallowed

Generative AI tools cannot be used in this assessment or deliverable

In such an assessment or deliverable, students must not use artificial intelligence (AI) to generate any materials or content in relation to the task. Use of AI will be considered an academic integrity violation and will trigger the Policy on Student Misconduct in Academic Studies. Examples of assessments or deliverables that might be classified AI-1 include, but are not limited to:

  • Summative assessments.
  • Formative assessments and quizzes.
  • Reflections.

AI-2: Restricted

Generative AI tools are restricted for this assessment or deliverable and require documentation

In such an assessment or deliverable, students are restricted either in the types of AI tools that may be used or in the aspects of the assignment on which AI may be employed. All use of AI must be appropriately acknowledged. (See AI-3.) The nature of the restrictions should be specified by the instructor. Examples of assessments or deliverables that might be classified AI-2 include, but are not limited to:

  • Journal article assignments where summarization is performed by AI, but assessment of strengths and limitations is generated by the student.
  • Production of summaries of topics that provide a basis for further non-AI-assisted inquiry.

AI-3: Documented

Generative AI tools may be used in any manner for this assessment or deliverable but require documentation

In such an assessment or deliverable, any AI tools may be used on any aspect of the assignment, but all use of AI must be appropriately acknowledged. Examples of assessments or deliverables that might be classified AI-3 include, but are not limited to:

  • Background summaries of topics where the main point of the assignment is generation of a new idea or proposal.
  • Assignments whose goal is to develop skills in using AI-based services.

AI-4: Unregulated

Generative AI tools are not restricted for this assessment or deliverable and documentation of use is not required

In such an assessment or deliverable, any AI tools may be used to assist in any way, and it is not necessary to document or attest to their use. Note that AI products are increasingly integrated into standard software packages (e.g., Microsoft products) to provide grammar and spell checking, and these capabilities will likely increase. Current versions of such products do not require citation.

3. Safe and compliant use of AI

This policy is designed to apply to assessments and deliverables used in the pedagogical process, not as part of direct patient care or human subjects research. Most commercially available AI systems are not compliant with HIPAA or FERPA protections and entry of patient or student information (identified or de-identified) into such systems is a violation of UCSF policies and potentially a crime. Use of AI systems in patient care or research requires direct, positive confirmation from a preceptor or research director that such use is allowed and that the system is authorized to work with such information.

4. References

5. Appendix

Standards for acknowledging use of generative AI, when such use is allowed

When assessments or deliverables are classified AI-2 or AI-3, documentation of the manner of gAI use (if any) must be provided. Documentation may take the form of primary source citations, summary statements, or both. Instructors should specify which type of documentation is expected for the assessment or deliverable.

Primary source citation

Assignments may require citation of primary sources. While gAI output may include references to primary sources, that output is not reviewed by experts, and the cited sources may be incorrect or fabricated. When primary source citations are specified, every factual statement in the work product that is not common knowledge must cite the original source using a standard citation style.

Summary statement

For some assignments, inclusion of a summary statement noting the use of the gAI and the manner of its use may be sufficient. Summary statements should note the name and URL of the gAI system, the use case, the prompt(s) used, and how the gAI's output was used or adapted:

I acknowledge the use of [insert AI system(s) and link] to [specific use of generative artificial intelligence]. The prompts used include [list of prompts]. The output from these prompts was used to [explain use].

Example:

I acknowledge the use of [1] ChatGPT to [2] generate materials for background research and self-study in the drafting of this assignment. I entered the following prompts on 4 January 2023:

  • [3] Write a 50-word summary about the formation of Monash University. Write it in an academic style. Add references and quotations from Sir John Monash.

[4] The output from the generative artificial intelligence was adapted and modified for the final response.


Policy approved by CEPC October 18, 2023