AI + Working
Cornell approaches AI in the workplace with the same care, curiosity, and rigor that define the university’s broader mission. Across campuses and units, teams are leveraging AI as a tool to enhance human expertise — streamlining routine tasks, strengthening decision-making, and opening new avenues for creativity and collaboration.
The AI Innovation Hub is a collaborative space for the Cornell community to explore and experiment with Generative AI. Here, faculty, staff, and students can build and test practical AI tools and applications that improve university operations and support Cornell’s mission.
Cornell Supported AI Tools
Generative AI (GenAI) can accelerate learning, creativity, and research, but it must be used responsibly. Using tools vetted by Cornell is more than a technical preference: new GenAI tools undergo a rigorous review for data privacy, security, accessibility, and alignment with university standards. This process safeguards sensitive information, supports ethical practice, and strengthens trust across our community.
Additional GenAI tools are currently under review or in early testing. These include Claude.ai, Claude Code, Google Gemini, and ChatGPT Edu. Contact the AI Innovation Hub to inquire about these or other tools, and to get advice about using AI to solve your challenges.
Cornell AI Platform
Cornell's AI Platform is a secure, private "sandbox" for accessing and experimenting with AI tools that comprises two complementary modules: AI Gateway and AI Agent Studio.
AI Gateway allows you to access frontier models such as Google Gemini, Anthropic Claude, and OpenAI GPT, and to experiment with other models such as Mistral, AWS Nova, and Grok, all in a secure Cornell environment. It also enables you to power tools like Claude Code and OpenWebUI, and to seamlessly integrate AI into your custom scripts and application development.
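As a rough illustration of what script integration could look like, here is a minimal sketch that assumes the AI Gateway exposes an OpenAI-compatible chat-completions endpoint. The URL, model name, and API key below are hypothetical placeholders, not real Cornell endpoints; pilot participants would substitute the values issued to them.

```python
# Minimal sketch: calling a model through an OpenAI-compatible gateway.
# The endpoint URL and model name are hypothetical placeholders.
import json
import urllib.request

GATEWAY_URL = "https://ai-gateway.example.cornell.edu/v1/chat/completions"  # hypothetical


def build_payload(question: str, model: str = "gpt-4o") -> dict:
    """Assemble a request body in the OpenAI chat-completions format."""
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": question},
        ],
    }


def ask_gateway(question: str, api_key: str) -> str:
    """POST the payload to the gateway and return the model's reply text."""
    req = urllib.request.Request(
        GATEWAY_URL,
        data=json.dumps(build_payload(question)).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

Because the gateway brokers the request, the same script can switch between providers by changing only the model name, while credentials and data stay inside the Cornell environment.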
AI Agent Studio allows teams to build, manage, and govern AI agents and workflows that run on schedules or respond to specific triggers, while providing observability and auditing tools.
The AI Platform is currently in a pilot phase, and we are actively seeking collaborators to help shape the future of AI at Cornell.
Microsoft Copilot Chat
The universitywide “private” tool provides access to OpenAI GPT models, in addition to OpenAI’s DALL-E image generation model. It enables faculty, staff, and students who are 18 years of age or older to experiment with GenAI text, image, and coding tools without storing their login and chat data or using that data to train the large language models. However, it does use Microsoft’s public search engine, where privacy is limited. For this reason, only enter low-risk data — information that the university has made available or published for the explicit use of the general public. If you need a tool that enables you to use higher levels of data, see Ideas, Requests, and Oversight of AI at Cornell for possible next steps.
Adobe Firefly
(Available with an Adobe license)
Firefly allows you to generate images from text, then manipulate and edit them. Coming soon: generative voice and video content.
Zoom AI Companion
The Zoom AI Companion gives hosts and participants shareable meeting summaries and next-steps lists, “highlight reels” in recordings, catch-up options for people joining a meeting late, and more. Currently, the university is evaluating security and privacy considerations for these new features.
Tools and Resources
Whether you’re looking to refine a prompt, explore new use cases, or talk through an idea with an expert, these resources connect you with the guidance and hands‑on support you need.
AI Innovation Hub
The Hub offers hands-on support for faculty, staff, and students building and testing practical AI tools and applications that improve university operations and support Cornell’s mission.
AI Exploration Series
Join AI Assistant Program Director Ayham Boucher for the debut of a bi-weekly series for Cornell students, faculty, and staff who want to know more about all things AI. The 30-minute workshop is held over Zoom at 2 p.m. ET.
Community Learning
Learn how others in the Cornell community are leveraging AI tools in creative ways. We highlight interesting stories, lessons learned, and best practices shared by our AI experts. It’s also a great opportunity to share what you’ve learned with the community.
GenAI at Cornell on Teams
Effective Prompts
Crafting an effective prompt is not the same as searching the web. Here are some tools to improve your prompting skills.
Guidelines and Best Practices
Cornell’s guidelines seek to balance the exciting new possibilities offered by these tools with awareness of their limitations and the need for rigorous attention to accuracy, intellectual property, security, privacy, and ethical issues. These guidelines are upheld by existing university policies.
When exploring AI tools, it is important to make informed choices about which tools we use and whether they provide privacy and protection of an individual’s personal information and institutional data. Free AI tools that are not offered by Cornell do not provide any material protection of data and should not be used to share or process academic or administrative information.
Accountability
You are accountable for your work, regardless of the tools you use to produce it. When using GenAI tools, always verify the information, check for errors and biases, and exercise caution to avoid copyright infringement. GenAI excels at applying predictions and patterns to create new content, but since it cannot understand what it produces, the results are sometimes misleading, outdated, or false.
Confidentiality and Privacy
If you are using public GenAI tools, you must not enter any Cornell information — or another person’s information — that is confidential, proprietary, subject to federal or state regulations, or otherwise considered sensitive or restricted. Any information you provide to public GenAI tools is considered public and may be stored and used by anyone else.
As noted in the University Privacy Statement, Cornell strives to honor the Privacy Principles: Notice, Choice, Accountability for Onward Transfer, Security, Data Integrity and Purpose Limitation, Access, and Recourse.
Use for Administration and Other Purposes
Cornell is aiming to offer or recommend a set of GenAI tools that will meet the needs of staff doing administrative work, while providing sufficient risk, security, and privacy protections. The use of GenAI for administrative purposes must comply with the guidelines of the Cornell Generative AI in Administration Task Force Report (January 2024).
Tools Under Review
Cornell is actively assessing additional GenAI tools to determine which meet our standards for privacy, security, accessibility, and responsible use. Each platform undergoes careful review to ensure it protects sensitive information, supports academic integrity, and aligns with the university’s values.
Contact the AI Innovation Hub to inquire about these or other tools, and to get advice about using AI to solve your challenges.
Anthropic Claude
Anthropic Claude tools include Claude.ai, Claude Code, and Claude Cowork. Cornell is working with Anthropic to shape an offering tailored to higher education, with a focus on meeting the university’s data privacy and security requirements.
OpenAI ChatGPT EDU
We’re working with OpenAI to onboard ChatGPT EDU as a private workspace for Cornell University. We will share more information as this offering becomes available.
GitHub Copilot
GitHub Copilot is an AI-powered pair programmer developed by GitHub (owned by Microsoft) in collaboration with OpenAI. It assists developers by providing autocompletions and code suggestions while they write code.
Google Gemini
Gemini provides AI integration with Google Workspace, including Deep Research and Gems. The university is evaluating privacy, security, licensing, potential cost impacts, and responsible-use considerations.
