AI @ The FAS

Generative artificial intelligence (AI) may prove to be one of the most impactful technologies of our times. The FAS is undertaking a year-long conversation on the topic, paying particular attention to the potential opportunities and disruptions of AI systems across academic and research integrity, pedagogy, and FAS administrative operations. 

We hope that all members of the FAS community will join in this exploration and encourage you to engage with us. Contact generativeAI@fas.harvard.edu for more information.

Areas of Focus

Pedagogy

How can AI advance our pedagogy by enhancing, rather than replacing, human interaction and critical thinking skills?

Academic & Research Integrity

What are the implications of these tools on fairness, honesty, and ethical behavior within the academic community?

Administrative Innovation

How can and should AI impact administrative operations from admissions and hiring to reporting and assessments?

Featured Event: AI @ FAS Symposium, May 1

Wednesday, May 1, 2024  
9:00 a.m.–2:30 p.m.  
Northwest Building 

Join us at the AI @ FAS Symposium to learn about the innovative ways faculty, staff, and students are applying artificial intelligence across the Faculty of Arts and Sciences. 

How are FAS faculty using AI in the classroom? 

Last December, the FAS community came together at the Generative AI Faculty Show & Tell to discover how FAS faculty are using the latest generative artificial intelligence technology in the classroom. With introductions by Latanya Sweeney (Harvard Kennedy School) and Scott Jordan (FAS Dean of Administration and Finance), the event featured dynamic live demos and discussions led by FAS faculty members Eric Beerbohm (Government), Maria Dikcis (English), David Malan (Computer Science), and Nicole Mills (Romance Languages and Literatures), who exchanged ideas and shared practical ways to integrate AI into academic work.

Prof. Eric Beerbohm

Eric Beerbohm is Professor of Government and Faculty Affiliate in the Department of Philosophy at Harvard University. He serves as Faculty Director of the Edmond & Lily Safra Center for Ethics and Faculty Dean of Quincy House. He teaches democratic theory and has tested generative AI in his class in multiple ways, including: (1) using ChatGPT as a student (asking it questions and learning from its mistakes), (2) unpacking ChatGPT’s ethical guidelines, and (3) using ChatGPT as a teaching co-pilot to better understand what is happening in the minds of his students.

Dr. Maria Dikcis

Dr. Maria Dikcis is a College Fellow in Media in the Department of English where she teaches English 90RI: Race in the Age of Artificial Intelligence. During the 2023-24 academic year, she is also a Researcher for the metaLAB (at) Harvard, where she is supporting the development of the AI Pedagogy Project.

Prof. David Malan

David J. Malan is Gordon McKay Professor of the Practice of Computer Science at Harvard University in the School of Engineering and Applied Sciences, as well as a Member of the Faculty of Education in the Graduate School of Education. He teaches Computer Science 50, otherwise known as CS50, which is among Harvard University’s largest courses. He built a “tutor bot” trained specifically on course materials, which students can use as a virtual course assistant for individualized support.

Dr. Nicole Mills

Nicole Mills is a Senior Preceptor in Romance Languages and Literatures and the Joint Director of Language Programs. She teaches French 11 Beginning French II: Paris in Virtual Reality. She experimented with AI in a beginning French language course in which students engaged in a global simulation of life in Paris, created fictional French-speaking characters, and then built a collective storyline by writing memoirs from their characters’ perspectives.

How can FAS staff leverage AI tools in their work? 

Ismael Carreras, Associate Dean for Strategic Analysis for FAS Administration & Finance, discusses his use of generative AI tools like ChatGPT and the FAS AI Sandbox for tasks such as drafting documents, summarizing long texts, and creating visuals and code. He also describes an approach that highlights six components of an effective prompt, which helps the generative AI model produce useful, high-quality content.

Exploring Generative AI Administrative Applications  

AI can be a valuable tool for a variety of writing and data analysis tasks in the office. Learn more about the ways AI can help with tasks such as drafting memos and emails, summarizing long documents, generating visuals and acronyms, and writing statistical code.

Writing Prompts for Generative AI

Writing effective prompts is important when using generative AI models to help generate and refine output. One strategy identifies six key components for crafting a good prompt: task, context, exemplars, persona, format, and tone.
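
As a purely illustrative sketch (not drawn from FAS guidance or any FAS tool), the short Python example below shows one way those six components might be assembled into a single prompt; the build_prompt helper and all of the sample component text are hypothetical.

    # Illustrative only: assemble the six prompt components (task, context,
    # exemplars, persona, format, tone) into one prompt string. The helper
    # and the sample text are hypothetical, not part of any FAS tool.
    def build_prompt(task, context, exemplars, persona, output_format, tone):
        """Combine the six components into a single prompt for a generative AI model."""
        return "\n\n".join([
            f"Persona: {persona}",
            f"Task: {task}",
            f"Context: {context}",
            f"Exemplars:\n{exemplars}",
            f"Format: {output_format}",
            f"Tone: {tone}",
        ])

    prompt = build_prompt(
        task="Summarize the attached meeting notes in five bullet points.",
        context="The notes come from a weekly administrative staff meeting.",
        exemplars="- Example bullet: 'Budget review moved to May 15.'",
        persona="You are an experienced administrative assistant.",
        output_format="A bulleted list; each bullet under 20 words.",
        tone="Concise and neutral.",
    )
    print(prompt)  # Paste the result into ChatGPT, the AI Sandbox, etc.

Keeping the components explicit makes it easy to iterate on a single element, such as the tone or the output format, without rewriting the whole prompt.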

AISWG Executive Committee

The Artificial Intelligence Systems Working Group (AISWG) Executive Committee provides a platform for connecting stakeholders across the FAS community who are thinking about the risks and opportunities of AI and how the technology can advance the FAS’s academic mission and organizational effectiveness. Informed by presentations from area co-leads, who connect activities happening on the ground across the FAS with FAS leadership, the AISWG will guide FAS strategy and planning for AI systems in teaching, research, and administrative operations, and will make recommendations on needed policy or process changes, major investments, or other FAS interventions. The AISWG will engage AI subject matter experts as needed.

AISWG Executive Committee Members

Committee Co-Chairs

  • Latanya Sweeney, Daniel Paul Professor of the Practice of Government and Technology, Faculty Dean of Currier House
  • Scott Jordan, Dean of Administration and Finance, FAS

Committee Members

  • Chris Bebenek, University Attorney
  • Lawrence Bobo, Dean of Social Science, W.E.B. Du Bois Professor of the Social Sciences
  • Ismael Carreras, Associate Dean of Strategic Analysis, FAS; Area Co-lead, Generative AI Guidance for FAS Administrative Innovation
  • Amanda Claybaugh, Dean of Undergraduate Education and Samuel Zemurray, Jr. and Doris Zemurray Stone Radcliffe Professor of English; Area Co-lead, Generative AI Guidance for Pedagogy
  • Nancy Coleman, Dean of the Division of Continuing Education
  • Nina Collins, Associate Dean and Chief of Staff to the Dean of the Faculty of Arts and Sciences
  • Melissa Dell, Andrew E. Furer Professor of Economics; Area Co-lead, Generative AI Guidance for Academic & Research Integrity
  • Emma Dench, Dean of the Kenneth C. Griffin Graduate School of Arts and Sciences and McLean Professor of Ancient and Modern History and of the Classics
  • Hopi Hoekstra, Edgerley Family Dean of the Faculty of Arts and Sciences, C.Y. Chan Professor of Arts and Sciences and Xiaomeng Tong and Yu Chen Professor of Life Sciences
  • Klara Jelinkova, Vice President, University and FAS Chief Information Officer
  • Robin Kelsey, Dean of Arts and Humanities and Shirley Carter Burden Professor of Photography
  • Rakesh Khurana, Danoff Dean of Harvard College; Area Co-lead, Generative AI Guidance for FAS Administrative Innovation
  • David Parkes, John A. Paulson Dean of the Harvard John A. Paulson School of Engineering and Applied Sciences
  • William Petrick, Associate Dean for Academic & Research Integrity and Student Conduct, Harvard College; Area Co-lead, Generative AI Guidance for Academic Integrity
  • Christopher Stubbs, Dean of Science and Samuel C. Moncher Professor of Physics and of Astronomy; Area Co-lead, Generative AI Guidance for Pedagogy

Additional Resources

Harvard University Information Technology

Harvard University Information Technology (HUIT) compiled a webpage that contains guidelines, tools, news, and other resources about using generative AI at Harvard.

AI Sandbox (HUIT/FAS)

The AI Sandbox provides a “walled-off,” secure environment in which to experiment with generative AI, mitigating many security and privacy risks and ensuring that the data entered will not be used to train any public AI tools. It offers a single interface with access to seven large language models (LLMs): Azure OpenAI GPT-3.5, GPT-3.5 16k, GPT-4, and GPT-4 32k; Anthropic Claude 2 and Claude Instant; and Google PaLM 2 Bison. The AI Sandbox is approved for use with Medium Risk Confidential data (L3) and below. Before using the AI Sandbox, please review these guidelines and instructions.

Harvard College Office of Undergraduate Education

The Harvard College Office of Undergraduate Education (OUE) has curated a comprehensive list of resources to help instructors use generative AI responsibly and effectively in their courses.

Derek C. Bok Center for Teaching and Learning

For educators looking to navigate the landscape of generative AI in the classroom, the Bok Center offers valuable guidance on adapting teaching methods to address the challenges and opportunities it presents.

Division of Science

The Division of Science has gathered an array of resources dedicated to the use of AI for scholarship and learning at Harvard.

AI @ Harvard

Harvard faculty, students, and scholars are doing cutting-edge research in data science, machine learning, modeling, data analysis, visualization, and ethics in fields spanning computer science, public health, medicine, law, public policy, business, the sciences, and more.