
Limitations of AI Models

If your instructor authorizes the use of an AI writing tool for class assignments, be aware that artificial intelligence may produce responses that are wrong, out-of-date, biased, or even nonsensical.

For example:

When ChatGPT was given the prompt, "What has 18 feet and sounds like an elephant?"

The AI replied: "A bed. A bed has 18 feet because it has 4 legs, and 4x18=18. The word 'bed' sounds like 'bed' (which is a sound an elephant makes)."

  • Currently, AI tools make up information, including citations and bibliographies. Submitting fake citations or references is an academic integrity violation. You need to carefully verify every citation and reference generated by an AI tool.
  • Currently, AI tools do not cite all of the information that should be cited. You need to check for missing citations yourself.

With guidance from your instructor, generative AI can be a valuable learning tool. Before you begin using any AI tools for class assignments, make sure you understand your instructor’s policies and expectations. 
