Generative AI can be a powerful tool for your work and studies. However, it's crucial to protect university and personal information. Before you paste any data into a genAI tool, use this guide to understand what's safe to share.
1. Public
This is information that is already publicly available and carries no risk.
- Examples: Course descriptions from the public catalog, university news releases, published research papers.
- AI Rule: Acceptable for Public AI (such as standard versions of Gemini, ChatGPT, Copilot, etc.).
2. Internal Use
This is non-sensitive information for general university operations. Its exposure would cause minor, if any, harm.
- Examples: General departmental memos, non-confidential meeting minutes, internal campus event schedules.
- AI Rule: Acceptable for Public AI, but always double-check that you aren't including any personal names, records, or details unnecessarily.
3. Confidential
This is sensitive data that could cause moderate harm to individuals or the university if disclosed. It requires strong protection.
- Examples: Student records (grades, schedules), unpublished research data, draft grant proposals, departmental budget information.
- AI Rule: Use only in a local, self-contained AI sandbox: a secure, private AI environment approved by the university where data remains protected. Do not use public AI tools. Consult with IET first.
4. Highly Confidential
This is very sensitive data that requires strict controls. Unauthorized access could cause significant financial, legal, or reputational damage.
- Examples: Sensitive personally identifiable information (PII) of students or staff, data from research involving human subjects, major donor information.
- AI Rule: Use only in a local, self-contained AI tool with the strictest security controls and permissions in place. This AI instance should be isolated (offline) from the Internet. Consult with IET.
5. Restricted
This is our most critical data, often protected by law (like HIPAA, PCI, or FERPA). Its exposure would result in severe consequences to the university.
- Examples: Academic records, ID numbers, patient health information, Social Security Numbers, credit card or bank account numbers, government-classified research.
- AI Rule: NEVER upload to any cloud-based AI tool. This data must be stored and handled only in specialized, highly secured university systems.
By keeping confidential data out of AI platforms, St. Edward’s can greatly reduce the risk of exposing sensitive information. When such data must be used, de-identify or mask it first.
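As an illustration of the masking step above, here is a minimal sketch in Python that redacts a few common identifier formats before text is shared with an AI tool. The patterns shown (SSN, email, card number) are examples only, not an approved or complete list; real masking should use a vetted tool and cover every data type in your records.

```python
import re

# Example patterns for common U.S. identifiers (illustrative only;
# these do not catch every format and are not an approved list).
PATTERNS = {
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "CARD": re.compile(r"\b(?:\d{4}[ -]?){3}\d{4}\b"),
}

def mask(text: str) -> str:
    """Replace matched identifiers with a labeled placeholder."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label} REDACTED]", text)
    return text

print(mask("Contact jdoe@stedwards.edu, SSN 123-45-6789."))
# Prints: Contact [EMAIL REDACTED], SSN [SSN REDACTED].
```

Even with masking applied, treat the output under the same classification rules until you have confirmed nothing sensitive remains.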
The Golden Rule
When in doubt, don't submit! If you are unsure how to classify a piece of data, treat it as Confidential and contact IET before proceeding.